11+ Data manipulation Jobs in Mumbai | Data manipulation Job openings in Mumbai
Job Responsibilities:
Excellent problem-solving and analytical skills: the ability to develop hypotheses, understand and interpret data within the context of the product/business, solve problems, and distill data into actionable recommendations.
Strong communication skills with the ability to confidently work with cross-
functional teams across the globe and to present information to all levels of the
organization.
Intellectual and analytical curiosity - initiative to dig into the why, what & how.
Strong number crunching and quantitative skills.
Advanced knowledge of MS Excel and PowerPoint.
Good hands-on SQL skills.
Experience with Google Analytics, Optimize, Tag Manager, and other Google Suite tools.
Understanding of business analytics tools and statistical programming languages (R, SAS, SPSS, Tableau) is a plus.
Inherent interest in e-commerce & marketplace technology platforms and
broadly in the consumer Internet & mobile space.
Previous experience of 1+ years working in a product company in
a product analytics role
Strong understanding of building and interpreting product funnels.
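The funnel requirement above can be illustrated with a short, hypothetical sketch. Everything here is invented for illustration (the `events` table, the step names, the data); `sqlite3` simply stands in for a production warehouse.

```python
import sqlite3

# Hypothetical product-funnel query: count distinct users reaching each
# step, then compute a conversion rate between two steps.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, step TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "visit"), (1, "signup"), (1, "purchase"),
     (2, "visit"), (2, "signup"),
     (3, "visit")],
)

rows = conn.execute(
    """
    SELECT step, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY step
    ORDER BY users DESC
    """
).fetchall()

funnel = dict(rows)
print(funnel)  # {'visit': 3, 'signup': 2, 'purchase': 1}
conversion = funnel["purchase"] / funnel["visit"]
print(f"visit -> purchase conversion: {conversion:.0%}")  # 33%
```

Interpreting the result (only 1 of 3 visitors purchases) and recommending where to intervene is the "interpreting product funnels" part of the role.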
Perquisites & Benefits:
Opportunity to work with India's no.1 crowdfunding platform
Be a part of a young, smart and rapidly growing team with management from Ivy League and Premier colleges
Competitive compensation and incentives
Fun, casual, relaxed and flexible work environment
1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements and deliver data-driven recommendations and reports to executives and stakeholders.
2. Ability to search, extract, transform and load data from various databases, cleanse and refine data until it is fit-for-purpose
3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met
4. Prioritization of issues to meet deadlines while ensuring high-quality delivery
5. Ability to pull data and to perform ad hoc reporting and analysis as needed
6. Ability to adapt quickly to new and changing technical environments, along with strong analytical and problem-solving abilities
7. Strong interpersonal and presentation skills
SKILLS:
1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel
2. Experience working with senior decision-makers
3. Strong advanced SQL/MySQL and Python skills with the ability to fetch data from the Data Warehouse as per the stakeholder's requirement
4. Good knowledge and experience in Excel VBA and advanced Excel
5. Good experience in building Tableau analytical dashboards as per the stakeholder's reporting requirements
6. Strong communication/interpersonal skills
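Skill 3 above (fetching warehouse data per a stakeholder's request) might look like the sketch below. The `orders` table, city names, and `revenue_by_city` helper are all invented, and `sqlite3` stands in for the actual data warehouse.

```python
import sqlite3

# Hypothetical stakeholder request: "revenue for a given city".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, city TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "Mumbai", 250.0), (2, "Mumbai", 100.0), (3, "Pune", 75.0)],
)

def revenue_by_city(conn, city):
    """Parameterized query: never interpolate request input into SQL."""
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE city = ?",
        (city,),
    ).fetchone()
    return row[0]

print(revenue_by_city(conn, "Mumbai"))  # 350.0
```

The parameterized `?` placeholder is the important habit here: it keeps ad hoc stakeholder inputs from turning into SQL-injection bugs.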
PERSONA:
1. Experience in working on ad hoc requirements
2. Ability to adapt to shifting priorities
3. Experience working in the fintech or e-commerce industry is preferable
4. Engineering background with 2+ years of experience as a Business Analyst for finance processes
DATA ANALYST
About:
We allow customers to "buy now and pay later" for goods and services purchased through online and offline portals. It is a rapidly growing organization opening up new avenues of payments for online and offline customers.
Role:
- Define and continuously refine the analytics roadmap.
- Build, deploy, and maintain the data infrastructure that supports all of the analysis, including the data warehouse and various data marts.
- Build, deploy, and maintain the predictive models and scoring infrastructure that powers critical decision management systems.
- Strive to devise ways to gather more alternate data and build increasingly enhanced predictive models.
- Partner with business teams to systematically design experiments to continuously improve customer acquisition, minimize churn, reduce delinquency, and improve profitability.
- Provide data insights to all business teams through automated queries, MIS, etc.
Requirements:
4+ years of deep, hands-on analytics experience in a management consulting, start-up, financial services, or fintech company. Should have strong knowledge of SQL and Python. Deep knowledge of problem-solving approaches using analytical frameworks. Deep knowledge of frameworks for data management, deployment, and monitoring of performance metrics. Hands-on exposure to delivering improvements through test-and-learn methodologies. Excellent communication and interpersonal skills, with the ability to be pleasantly persistent.
Location: Mumbai
About the company
WonDRx (pronounced Wonder-Rx), spearheaded by two serial entrepreneurs, Mr. Pankaj Sindhu and his co-founder Mr. Pankaj Agrawal, is a unique and novel technology with the capacity to change the way the healthcare ecosystem interacts. We are determined to change how the healthcare domain works and make it more convenient, while ensuring a pleasant experience for consumers and all other stakeholders involved. You need to see the product to believe the power it has and what it can enable in the healthcare domain.
Designation: Team Leader Data Analyst & Project Management
Job Description:
We are seeking a highly skilled and detail-oriented Team Leader, Data Analyst & Project Management, to join our team. In this role, you will be responsible for extracting insights from raw data, preparing analytics reports, and overseeing data-driven projects from start to finish.
The ideal candidate should have a strong background in Advanced Excel (proficiency in Business Intelligence (BI) tools is an added advantage) and experience following up with various internal and external stakeholders on ongoing projects.
Responsibilities:
• Gather, clean, and validate raw data from various sources.
• Develop and maintain efficient data management systems.
• Apply advanced Excel techniques to manipulate and analyse data effectively.
• Utilize BI tools to create insightful reports, visualizations, and dashboards.
• Identify trends, patterns, and correlations in data sets.
• Collaborate with cross-functional teams to understand their data needs and provide analytical support.
• Present findings and recommendations to stakeholders in a clear and understandable manner.
• Manage data-driven projects from initiation to completion, ensuring deliverables are met on time and within budget.
• Maintain project plans, including timelines, resource allocation, and task assignments.
• Monitor project progress, identify risks, and implement mitigation strategies.
• Coordinate and communicate with team members, ensuring alignment and effective collaboration.
• Prepare project status reports and deliver presentations to stakeholders.
• Evaluate project outcomes and identify areas for improvement.
Requirements:
• Master's or Bachelor's degree with relevant experience in coordinating data-driven projects
• Proficiency in Advanced Excel functions and formulas.
• Knowledge of at least one Business Intelligence (BI) tool (e.g., Tableau, Power BI, QlikView) will be an added advantage
• Excellent project management skills, including planning, organization, and prioritization.
• Strong problem-solving and critical-thinking abilities.
• Excellent communication and interpersonal skills.
• Ability to work independently and manage multiple projects simultaneously.
• Attention to detail and ability to work with complex datasets.
Preferred Skills:
• Data Analysis: The candidate should have strong analytical skills, which are crucial for a data analyst. You should be able to collect, organize, and interpret complex data sets to extract meaningful insights. Advanced Excel is a must.
• Communication and Presentation: Data analysts should be able to effectively communicate their findings to both technical and non-technical stakeholders. You would need to effectively communicate with team members, stakeholders, and other project stakeholders. This includes written communication through project documentation and reports, as well as verbal communication for conducting meetings, providing updates, and resolving issues.
• Continuous Learning: We are a constantly evolving startup. Ideal candidate should have willingness to learn new techniques, stay updated with the latest tools and technologies, and adapt to changing trends which is crucial for long-term success.
• Team Player: You should be able to motivate and inspire team members, provide guidance, and facilitate decision-making processes.
• Project Coordinator: As a project coordinator you should possess strong problem-solving and critical-thinking skills to identify issues, analyse root causes, and propose and implement appropriate solutions
PL/SQL Developer
Experience: 4 to 6 years
Skills: MS SQL Server and Oracle; AWS or Azure
• Experience in setting up RDS service in cloud technologies such as AWS or Azure
• Strong proficiency with SQL and its variation among popular databases
• Should be well-versed in writing stored procedures, functions, and packages, and in using collections.
• Skilled at optimizing large, complicated SQL statements.
• Should have worked in migration projects.
• Should have worked on creating reports.
• Should be able to distinguish between normalized and de-normalized data modelling designs and use cases.
• Knowledge of best practices when dealing with relational databases
• Capable of troubleshooting common database issues
• Familiar with tools that can aid with profiling server resource usage and optimizing it.
• Proficient understanding of code versioning tools such as Git and SVN
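As a hedged illustration of the query-optimization skill above, the snippet below shows the most common fix, adding an index so the planner stops scanning the whole table. `sqlite3` is used only as a stand-in for Oracle/SQL Server (which have their own plan tools), and the `txns` table is invented.

```python
import sqlite3

# Compare the query plan for the same statement before and after indexing.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE txns (id INTEGER PRIMARY KEY, account TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO txns (account, amount) VALUES (?, ?)",
    [(f"A{i % 100}", float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM txns WHERE account = 'A7'"

plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
conn.execute("CREATE INDEX idx_txns_account ON txns (account)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # a full-table SCAN of txns
print(plan_after[0][-1])   # a SEARCH using idx_txns_account
```

Oracle's `EXPLAIN PLAN` and SQL Server's execution plans serve the same role: read the plan first, then optimize.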
At Magic9 Media and Consumer Knowledge Pvt. Ltd.
Job Description
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. The ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts, and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Exercise your experience in the development lifecycle through analysis, design, development, testing and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
Qualifications:
- 3-5 years of relevant work experience in the areas outlined below
- Experience in extracting data using SQL from large databases
- Experience in writing complex ETL processes and frameworks for analytics and data management. Must have experience working with ETL tools.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General purpose programming (Python, Scala, Java) and familiarity with Hadoop is a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable with analyzing complex, high-volume and high-dimension data from varying sources
- Excellent verbal, written and computer communication skills
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
We are establishing infrastructure for internal and external reporting using Tableau and are looking for someone with experience building visualizations and dashboards in Tableau and using Tableau Server to deliver them to internal and external users.
Required Experience
- Implementation of interactive visualizations using Tableau Desktop
- Integration with Tableau Server and support of production dashboards and embedded reports with it
- Writing and optimization of SQL queries
- Proficient in Python including the use of Pandas and numpy libraries to perform data exploration and analysis
- 3 years of experience working as a Software Engineer / Senior Software Engineer
- Bachelor's in Engineering (Electronics and Communication, Computer, or IT)
- Well versed in basic data structures, algorithms, and system design
- Should work well in a team and possess very good communication skills
- Self-motivated, organized, and fun to work with
- Productive and efficient working remotely
- Test driven mindset with a knack for finding issues and problems at earlier stages of development
- Interest in learning and picking up a wide range of cutting edge technologies
- Should be curious and interested in learning some Data science related concepts and domain knowledge
- Work alongside other engineers on the team to elevate technology and consistently apply best practices
Highly Desirable
- Data Analytics
- Experience in AWS cloud or any cloud technologies
- Experience in big data and streaming technologies such as PySpark and Kafka is a big plus
- Shell scripting
- Preferred tech stack: Python, REST APIs, microservices, Flask/FastAPI, pandas, numpy, Linux, shell scripting, Airflow, PySpark
- Strong backend experience with microservices and REST APIs (Flask, FastAPI) and with relational and non-relational databases
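The pandas/numpy data-exploration requirement above can be sketched in a few lines. The DataFrame contents here are invented; the pattern (summary statistics plus a group-by) is the bread and butter of the exploration work described.

```python
import pandas as pd

# Minimal data-exploration sketch: aggregate by a dimension and
# compute a summary statistic over the whole column.
df = pd.DataFrame({
    "city": ["Mumbai", "Mumbai", "Pune", "Pune"],
    "amount": [250.0, 100.0, 75.0, 25.0],
})

by_city = df.groupby("city")["amount"].sum()
print(by_city.to_dict())    # {'Mumbai': 350.0, 'Pune': 100.0}
print(df["amount"].mean())  # 112.5
```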
- Should be able to use transformation components to transform the data
- Should possess knowledge of incremental load, full load, etc.
- Should design, build, and deploy effective packages
- Should be able to schedule these packages through task schedulers
- Implement stored procedures and effectively query a database
- Translate requirements from the business and analysts into technical code
- Identify and test for bugs and bottlenecks in the ETL solution
- Ensure the best possible performance and quality in the packages
- Provide support and fix issues in the packages
- Write advanced SQL, including some query tuning
- Experience in the identification of data quality issues
- Some database design experience is helpful
- Experience designing and building complete ETL/SSIS processes moving and transforming data for ODS, staging, and data warehousing
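The incremental-vs-full-load distinction mentioned above can be sketched outside SSIS. Plain Python with `sqlite3` and an invented `sales` schema stands in here: a full load truncates and reloads the target, while an incremental load copies only rows past the last high-water mark (the maximum `id` already loaded).

```python
import sqlite3

# Source and target databases with the same (hypothetical) schema.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

def incremental_load(src, tgt):
    """Copy only rows newer than the target's high-water mark."""
    last_id = tgt.execute(
        "SELECT COALESCE(MAX(id), 0) FROM sales"
    ).fetchone()[0]
    rows = src.execute(
        "SELECT id, amount FROM sales WHERE id > ?", (last_id,)
    ).fetchall()
    tgt.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return len(rows)

print(incremental_load(src, tgt))  # 2: first run copies everything
src.execute("INSERT INTO sales VALUES (3, 30.0)")
print(incremental_load(src, tgt))  # 1: only the new row moves
```

In SSIS the same idea is usually expressed with a lookup or a stored high-water-mark variable feeding the data-flow source query.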
About Us
upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, and Entrepreneurship, among others. upGrad is looking for people passionate about management and education to help design learning programs for working professionals, helping them stay sharp and relevant while building the careers of tomorrow.
Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fin-tech industry, from earlier working with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
- Ensuring ease of data availability, with relevant dimensions, using Business Intelligence tools.
- Providing strong reporting and analytical information support to the management team.
- Transforming raw data into essential metrics based on the needs of relevant stakeholders.
- Performing data analysis for generating reports on a periodic basis.
- Converting essential data into easy to reference visuals using Data Visualization tools (PowerBI, Metabase).
- Providing recommendations to update current MIS to improve reporting efficiency and consistency.
- Bringing fresh ideas to the table and keenly observing trends in the analytics and financial services industry.
What you need to have:
- B.Tech/B.E. or MBA/PGDM graduate, with 3+ years of work experience.
- Experience in Reporting, Data Management (SQL, MongoDB), Visualization (PowerBI, Metabase, Data studio)
- Work experience (into financial services, Indian Banks/ NBFCs in-house analytics units or Fintech/ analytics start-ups would be a plus.)
- Skilled at writing & optimizing large complicated SQL queries & MongoDB scripts.
- Strong knowledge of Banking/ Financial Services domain
- Experience with some of the modern relational databases
- Self-driven, with the ability to work on multiple projects of a different nature.
- Liaise with cross-functional teams to resolve data issues and build strong reports
Data Steward:
The Data Steward will collaborate and work closely with the group's software engineering and business divisions. The Data Steward has overall accountability for the group's/division's data and reporting posture, responsibly managing data assets, data lineage, and data access, and supporting sound data analysis. This role focuses on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes; makes well-thought-out decisions on complex or ambiguous data issues; and establishes the data stewardship and information management strategy and direction for the group. Effectively communicates with individuals at various levels of the technical and business communities. This individual will become part of the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.
Primary Responsibilities:
- Responsible for data quality and data accuracy across all group/division delivery initiatives.
- Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
- Responsible for reviewing and governing data queries and DML.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
- Accountable for the performance, quality, and alignment to requirements for all data query design and development.
- Responsible for defining standards and best practices for data analysis, modeling, and queries.
- Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
- Responsible for the development and maintenance of an enterprise data dictionary that is aligned to data assets and the business glossary for the group.
- Responsible for the definition and maintenance of the group's data landscape, including overlays with the technology landscape, end-to-end data flows/transformations, and data lineage.
- Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
- Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
- Owns group's data assets including reports, data warehouse, etc.
- Understand customer business use cases and be able to translate them to technical specifications and vision on how to implement a solution.
- Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
- Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
- Responsible for solving data-related issues and communicating resolutions with other solution domains.
- Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
- Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
- Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
- Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
- Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.
Additional Responsibilities:
- Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
- Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
- Knowledge and understanding of Information Technology systems and software development.
- Experience with data modeling and test data management tools.
- Experience in data integration projects.
- Good problem-solving and decision-making skills.
- Good communication skills within the team, site, and with the customer
Knowledge, Skills and Abilities
- Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
- Solid understanding of key DBMS platforms like SQL Server, Azure SQL
- Results-oriented, diligent, and works with a sense of urgency. Assertive, responsible for his/her own work (self-directed), have a strong affinity for defining work in deliverables, and be willing to commit to deadlines.
- Experience in MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio etc.
- Experience in Report and Dashboard development
- Statistical and Machine Learning models
- Python (sklearn, numpy, pandas, gensim)
- Nice to Have:
- 1yr of ETL experience
- Natural Language Processing
- Neural networks and Deep learning
- Experience in the Keras, TensorFlow, spaCy, NLTK, and LightGBM Python libraries
Interaction: Frequently interacts with subordinate supervisors.
Education: Bachelor's degree, preferably in Computer Science, B.E., or another quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required.
Experience: 7 years of pharmaceutical/biotech/life sciences experience, 5 years of clinical trials experience and knowledge, and excellent documentation, communication, and presentation skills, including PowerPoint.