11+ QMC Jobs in India
Business Intelligence Consultant – Qlik
Role
· Working through customer specifications and developing solutions in line with defined requirements
· Strategizing and ideating the solution design (creating prototypes and/or wireframes) before building the application or solution
· Creating load scripts and QVDs to support dashboards.
· Creating data models in Qlik Sense to support dashboards.
· Leading data discovery, assessment, analysis, modeling and mapping efforts for Qlik dashboards.
· Developing visual reports, dashboards and KPI scorecards using Qlik
· Connecting to data sources (MS SQL Server, Oracle, SAP), importing and transforming data for Business Intelligence (see the staging sketch after this list)
· Translating data into informative visuals and reports.
· Developing, publishing and scheduling reports as per the business requirements.
· Implementing application security layer models in Qlik
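For illustration only, the data-staging side of this work can be sketched in a few lines of Python. This is a minimal, hypothetical example of pulling a slice of data from MS SQL Server into a flat extract that a Qlik load script could then store as a QVD; the connection string, table and column names are assumptions, not part of the role.

```python
# Hypothetical staging sketch for a Qlik dashboard feed.
# Server, database, table and column names are illustrative assumptions.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sales-db;DATABASE=SalesDW;Trusted_Connection=yes;"
)

# Pull only the columns the dashboard needs, pre-computing the measure.
query = """
    SELECT order_id, customer_id, order_date,
           quantity * unit_price AS line_amount
    FROM dbo.OrderLines
    WHERE order_date >= DATEADD(year, -2, GETDATE());
"""
df = pd.read_sql(query, conn)

# Land a flat extract that a Qlik load script can read and store as a QVD.
df.to_csv("staging/order_lines.csv", index=False)
conn.close()
```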
Skills Required
· Knowledge of data visualization and data analytics principles and skills, including good user experience/UI design
· Hands-on Qlik Sense development experience
· Knowledge of writing SQL queries
· Exceptional analytical skills, problem-solving skills and excellent communication skills
Qualifications
1. Degree in Computer Science Engineering disciplines or MCA
2. 2-4 years of hands-on Qlik experience
3. Qlik Sense certification would be preferred
- Extensive exposure to at least one Business Intelligence platform, preferably QlikView/Qlik Sense; if not Qlik, knowledge of an ETL tool such as Informatica or Talend
- At least one data query language: SQL or Python
- Experience in creating breakthrough visualizations
- Understanding of RDBMS, Data Architecture/Schemas, Data Integrations, Data Models and Data Flows is a must
• Development/Testing/Processes
• Developing effective QlikView/Qlik Sense data models
• Developing front end applications using Qlik technology
• Utilizing scripting language to meet complex business requirements
• Utilizing Qlik Publisher/NPrinting capabilities
• Extract, transform and load (ETL) data from multiple data sources into the Qlik application
• Design, build, test and debug Qlik solutions based upon specified requirements
• Follow implementation standards
• Utilize source control tools
• Follow deployment process
• Experience creating extract/transform/load routines from data sources including SAP BW, SAP R/3, MS SQL Server, DB2 and Oracle, as well as other data sources
• Solid experience developing complex Qlik data models
Specific Responsibilities:
• Participating in business requirements and design review sessions
• Providing input on proposing, evaluating and selecting appropriate design alternatives which meet requirements and are consistent with our current standards and processes
• Extracting, transforming and loading data into Qlik applications
• Developing, testing, debugging Qlik applications
• Migrating code across development and testing landscapes
• Creating publisher jobs
• Developing documentation
• Transferring knowledge and handing the application over to the BI Support team
• Good communication skills and ability to interact with the customer
• Willingness to travel is mandatory
• Experience with Qlik Sense and GeoAnalytics is an added advantage
JOB SUMMARY: The Senior Associate supports the Data Analytics Manager by proposing relevant analytics procedures/tools, executing the analytics and developing visualization outputs for audits, continuous monitoring/auditing and IA initiatives. The individual's responsibilities include:
Understanding audit and/or project objectives and assisting the manager in preparing the plan and timelines.
Working with the Process/BU/IA teams for gathering requirements for continuous monitoring/auditing projects.
Working with Internal audit project teams to understand the analytics requirements for audit engagements.
Independently building pilots/prototypes, determining the appropriate visualization tool and designing the views to meet project objectives.
Proficiency in data management and data mining.
Strong hands-on skills with visualization tools such as QlikView, Qlik Sense, Power BI, Tableau and Alteryx.
Working with Data Analytics Manager to develop analytics program aligned to the overall audit plan.
Showcasing analytics capability to Process management teams to increase adoption of continuous monitoring.
Establishing and maintaining relationships with all key stakeholders of internal audit.
Coaching other data analysts on analytics procedures, coding and tools.
Taking a significant and active role in developing and driving Internal Audit Data Analytics quality and knowledge sharing to enhance the value provided to Internal Audit stakeholders.
Ensuring timely and accurate time tracking.
Continuously focusing on self-development by attending trainings, seminars and acquiring relevant certifications.
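For a concrete flavour of the continuous-monitoring analytics described above, a classic internal-audit test is duplicate-payment detection. The pandas sketch below is a hypothetical illustration; the input file and column names are assumptions.

```python
# Hypothetical continuous-monitoring test: flag potential duplicate vendor
# payments (same vendor, amount and invoice date). File and column names
# are illustrative assumptions.
import pandas as pd

payments = pd.read_csv("payments_extract.csv", parse_dates=["invoice_date"])

# keep=False marks every member of a duplicate group, not just the repeats.
dupes = payments[payments.duplicated(
    subset=["vendor_id", "invoice_amount", "invoice_date"], keep=False
)].sort_values(["vendor_id", "invoice_date"])

# Hand the exceptions to reviewers, e.g. as a QlikView/Power BI source.
dupes.to_csv("exceptions_duplicate_payments.csv", index=False)
print(f"{len(dupes)} candidate duplicate payment rows flagged")
```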
As a Power BI and QlikView Developer, we expect the candidate to be a key contributor to the implementation of data analytics dashboards, from data preparation to dashboard development, unit testing and deployment. The candidate's primary work focus is as follows:
- Understanding the database design.
- Developing efficient SQL queries, from simple to complex, and testing the data output as part of data preparation (see the sketch after this list)
- Development & Unit Testing – Dashboards & Data Visualizations using Power BI
- Troubleshooting/debugging and rectifying issues
- Reviewing work, giving feedback and mentoring the team
- Adhering to the standards and best practices defined by the company, both individually and as a team
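As a sketch of what testing the data output can look like in practice, the hypothetical snippet below runs a prepared query and applies basic data-quality assertions before the result feeds a Power BI visual; the server, database and view names are assumptions.

```python
# Hypothetical unit test for a data-preparation query feeding a dashboard.
# Connection details and the revenue_by_region view are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reporting-db;DATABASE=Analytics;Trusted_Connection=yes;"
)
rows = conn.cursor().execute(
    "SELECT region, total_revenue FROM dbo.revenue_by_region"
).fetchall()

# Basic data-quality checks: non-empty result, no NULL or negative measures.
assert rows, "query returned no rows"
assert all(r.total_revenue is not None for r in rows), "NULL revenue found"
assert all(r.total_revenue >= 0 for r in rows), "negative revenue found"
print(f"{len(rows)} regions validated")
conn.close()
```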
QUALIFICATIONS AND EXPERIENCE
- Degree in BE/ BTech with at least 3 to 5 years of overall experience
- Experience working on multiple databases, with MS SQL Server mandatory and PostgreSQL, MongoDB or MySQL a plus, across a minimum of 2 to 3 data analytics projects.
- At least 1 to 2 years of experience writing queries on SQL Server is mandatory.
- Expertise in Power BI and working knowledge of QlikView (1-2 years).
- Understanding other tools like Tableau, Domo etc. would be a great plus
- Experience in working on different types of visualizations, in addition to the generic ones: scatter plots, heat maps, geo maps, Gantt charts, bubble charts, tree maps etc.
- Experience working on Trends, forecasting etc. would be a plus
- Experience in working with projects teams and ensuring the successful delivery of the solution
• Working Knowledge of XML, JSON, Shell and other DBMS scripts
• Hands-on experience with Oracle 11g and 12c. Working knowledge of Oracle 18c and 19c
• Analysis, design, coding, testing, debugging and documentation. Complete knowledge of the Software Development Life Cycle (SDLC).
• Writing Complex Queries, stored procedures, functions and packages
• Knowledge of REST Services, UTL functions, DBMS functions and data integration is required
• Good knowledge of table-level partitions and row locks, and experience in OLTP.
• Should be aware of ETL tools, data migration and data mapping functionalities
• Understand the business requirement and transform/design it into business solutions. Perform data modelling and implement the business rules using Oracle database objects.
• Define source-to-target data mapping and data transformation logic as per the business need.
• Should have worked on materialised view creation and maintenance. Experience in performance tuning and impact analysis is required (see the sketch after this list).
• Monitoring and optimizing the performance of the database. Planning for backup and recovery of database information. Maintaining archived data. Backing up and restoring databases.
• Hands on Experience on SQL Developer
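To make some of the above concrete, here is a hypothetical Python sketch that refreshes a materialised view via the standard Oracle DBMS_MVIEW package and runs a partition-aware query, using the python-oracledb driver; the credentials, DSN, view and partition names are assumptions.

```python
# Hypothetical sketch: PL/SQL interaction from Python via python-oracledb.
# Credentials, DSN, materialised view and partition names are assumptions;
# DBMS_MVIEW.REFRESH itself is a standard Oracle package procedure.
import oracledb

conn = oracledb.connect(user="app", password="secret",
                        dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()

# Complete refresh ("C") of a materialised view named SALES_MV.
cur.callproc("DBMS_MVIEW.REFRESH", ["SALES_MV", "C"])

# Partition-aware aggregate against an OLTP table.
cur.execute("""
    SELECT order_status, COUNT(*)
    FROM orders PARTITION (orders_2024)
    GROUP BY order_status
""")
for status, n in cur:
    print(status, n)

cur.close()
conn.close()
```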
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 4-10 years
Location : Pune
Mandatory Skills -
- Strong in ETL/SQL development
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have experience working with Python, shell scripting
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica and Greenplum platforms, along with ETL tools like Informatica, DataStage, Ab Initio and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud (a generic sketch of the validation idea follows this list).
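At its core, automated migration validation of this kind rests on a simple idea: systematically compare the source and target after each load. The sketch below shows only that generic idea, not Pelican itself; the cursor objects and table names are placeholders.

```python
# Generic illustration of automated data-migration validation:
# compare row counts between a source and a target table.
# Not Pelican itself; cursors and table names are placeholders.
def validate_table(src_cursor, tgt_cursor, table):
    src_cursor.execute(f"SELECT COUNT(*) FROM {table}")
    src_n = src_cursor.fetchone()[0]
    tgt_cursor.execute(f"SELECT COUNT(*) FROM {table}")
    tgt_n = tgt_cursor.fetchone()[0]
    return {"table": table, "source_rows": src_n,
            "target_rows": tgt_n, "match": src_n == tgt_n}

# Usage sketch: report = [validate_table(src, tgt, t) for t in tables]
```

Real tools go much further (column-level checksums, sampling, reconciliation reports), but a row-count comparison like the one above is the usual starting point.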
Why join us!
Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
• Experienced developer in shell scripting
• Perl scripting
• PL/SQL knowledge is required.
• Advanced communication skills are a must.
• Ability to learn new applications and technologies
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech lead level. Reporting to the Director of Big Data, he/she will play a critical role in leading Punchh's big data innovations. By leveraging prior industry experience in big data, he/she will help create cutting-edge data and analytics products for Punchh's business partners.
This role requires close collaboration with the data, engineering, and product organizations. The job functions include:
- Work with large data sets and implement sophisticated data pipelines with both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success and data science to name a few.
- Serve as a technical leader of Punchh's big data platform that supports AI and BI products.
- Work with infra and operations team to monitor and optimize existing infrastructure
- Occasional business travel is required.
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge of cloud technologies, e.g. AWS and Azure.
- Excellent software engineering background. High familiarity with the software development life cycle. Familiarity with GitHub/Airflow.
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR) and streaming (Kafka, Spark); see the sketch after this list.
- Strong problem solving skills with demonstrated rigor in building and maintaining a complex data pipeline.
- Exceptional communication skills and ability to articulate a complex concept with thoughtful, actionable recommendations.
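As a flavour of the streaming side of this stack, the hypothetical PySpark sketch below consumes a Kafka topic with Structured Streaming and lands it as Parquet; the broker address, topic and storage paths are assumptions, and the job presumes the Spark-Kafka connector is on the classpath.

```python
# Hypothetical sketch: Kafka -> Spark Structured Streaming -> Parquet.
# Broker, topic and storage paths are illustrative assumptions; requires
# the spark-sql-kafka connector on the classpath (e.g. via --packages).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "customer-events")
          .load()
          .select(col("key").cast("string"),
                  col("value").cast("string"),
                  "timestamp"))

# Land the raw stream for downstream AI/BI products to build on.
query = (events.writeStream
         .format("parquet")
         .option("path", "s3://datalake/raw/customer-events/")
         .option("checkpointLocation", "s3://datalake/checkpoints/events/")
         .start())
query.awaitTermination()
```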
at Velocity Services
We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.
We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.
Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!
The technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda etc.
Key Responsibilities
- Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality
- Work with the Office of the CTO as an active member of our architecture guild
- Writing pipelines to consume data from multiple sources
- Writing a data transformation layer using DBT to transform millions of records into the data warehouse (see the orchestration sketch after this list)
- Implement data warehouse entities with common re-usable data model designs with automation and data quality capabilities
- Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
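As a sketch of how the DBT transformation layer mentioned above is typically orchestrated, the hypothetical Airflow DAG below runs dbt models and then dbt tests; the project path, schedule and dbt target are assumptions, not Velocity's actual setup.

```python
# Hypothetical orchestration sketch: Airflow triggering a dbt project.
# Project path, schedule and dbt target are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the warehouse entities from the transformation layer...
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/analytics/dbt && dbt run --target prod",
    )
    # ...then gate downstream consumers on dbt data-quality tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/analytics/dbt && dbt test --target prod",
    )
    dbt_run >> dbt_test
```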
What To Bring
- 3+ years of software development experience; startup experience is a plus
- Past experience working with Airflow and DBT is preferred
- 2+ years of experience working in any backend programming language
- Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL
- Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development)
- Experience formulating ideas, building proof-of-concept (POC) work and converting it to production-ready projects
- Experience building and deploying applications on on-premise infrastructure and on cloud platforms such as AWS or Google Cloud
- Basic understanding of Kubernetes and Docker is a must
- Experience in data processing (ETL, ELT) and/or cloud-based platforms
- Working proficiency and communication skills in verbal and written English