11+ Informatica PowerCenter Jobs in Mumbai
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
- Exposure to change data capture (CDC) technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Head complex ETL requirements and design.
- Implement an Informatica-based ETL solution that fulfills stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements.
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements.
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend and implement ETL process and architecture improvements.
- Assist with and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture.
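The posting pairs Informatica with change data capture against Azure SQL. As a rough, hedged sketch of that loading pattern (outside Informatica itself), the Python below merges only rows changed since a watermark into an Azure SQL target via pyodbc. The connection string, table names and watermark column are hypothetical placeholders, not details from the posting:

```python
import pyodbc

# Hypothetical Azure SQL connection string (placeholder values).
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=etl_user;PWD=secret"
)

def load_changed_rows(last_watermark):
    """CDC-style incremental load: merge rows changed since the last watermark."""
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # Assumes the staging table carries a 'modified_at' change-tracking column.
        cur.execute(
            """
            MERGE INTO dbo.customer AS tgt
            USING (SELECT * FROM stg.customer WHERE modified_at > ?) AS src
                ON tgt.customer_id = src.customer_id
            WHEN MATCHED THEN UPDATE SET tgt.name = src.name, tgt.email = src.email
            WHEN NOT MATCHED THEN INSERT (customer_id, name, email)
                 VALUES (src.customer_id, src.name, src.email);
            """,
            last_watermark,
        )
        conn.commit()
```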
Experience: 8+ years
AWS certification is a must.
Location: Pan India
Leading Payment Solution Company
About Company:
The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.
What you will need:
- A senior engineer with a strong background and experience in cloud-related technologies and architectures.
- Able to design target cloud architectures and transform existing architectures together with the in-house team.
- Able to configure and build cloud architectures hands-on and guide others.
Key Knowledge
- 3-5+ years of experience in AWS/GCP or Azure technologies
- Is likely certified on one or more of the major cloud platforms
- Strong hands-on experience with technologies such as Terraform, Kubernetes (K8s) and Docker, and with container orchestration (see the sketch after this list)
- Ability to guide and lead internal agile teams on cloud technology
- Background from the financial services industry or similar critical operational experience
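As a small, hedged illustration of working with container orchestration programmatically, the sketch below uses the official kubernetes Python client to list pods and their phases. The local kubeconfig and the "default" namespace are assumptions, not details from the posting:

```python
from kubernetes import client, config

def list_pods(namespace: str = "default"):
    """Print the name and phase of every pod in the given namespace."""
    config.load_kube_config()  # assumes a working local kubeconfig
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod(namespace=namespace).items:
        print(pod.metadata.name, pod.status.phase)

if __name__ == "__main__":
    list_pods()
```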
A leading IT-based company in Navi Mumbai
Hello,
Greetings for the day!
Tridat Technologies is hiring an "L1 Windows Server Administrator" for an advanced technology solutions company catering to the needs of the Banking, Mobility, Payments and Government sectors.
Qualifications: Any graduate
Experience: 2+ yrs
Roles & Responsibilities:
• Windows/Linux OS administration (certification will be an added value)
• Troubleshooting knowledge
• Active Directory (AD) knowledge
• Cloud (AWS/Azure/GCP) knowledge (certification will be an added value)
• Good communication skills
• Teamwork
• Familiarity with remote management
• 24x7 support
• Experience in virtualisation (VMware & Hyper-V)
• Familiarity with ticketing and the ITSM process
Location: Rabale, Navi Mumbai
Working Hours: 24x7 rotational shifts
Employment Mode: Contract to hire (Full time opportunity)
Joining Period: Immediate to max 15 days
Thank You & Regards,
Shraddha Kamble
HR Recruiter
BASIC QUALIFICATIONS
- BS-level technical degree or equivalent professional experience
- Skills to represent AWS ProServe well within the customer’s environment and drive discussions with senior personnel regarding trade-offs, best practices, project management and risk mitigation
- Demonstrated ability to think strategically about business, product, and technical challenges
- Highly technical and analytical, possessing 15 or more years of IT implementation experience
- Proven experience with the software development life cycle (SDLC) and agile/iterative methodologies
- Hands on experience in infrastructure implementation
- Automation skills in Python and PowerShell, and knowledge of CloudFormation templates (see the sketch after this list)
- Strong experience in Architecture, designing and migrating applications on the AWS platform
- Experience with IT compliance and risk management requirements (e.g. security, privacy, SOX, HIPAA)
- Experience working with customers, partners or third-party developers
- Strong verbal and written communications skills and ability to lead effectively across organizations
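To ground the CloudFormation automation skills above, here is a minimal, hedged Python sketch that uses boto3 to validate a template and launch a stack. The stack name, template path and region are hypothetical placeholders:

```python
import boto3

# Hypothetical region (placeholder, not from the posting).
cfn = boto3.client("cloudformation", region_name="ap-south-1")

def deploy_stack(stack_name: str, template_path: str):
    """Validate a CloudFormation template, create the stack, and wait for completion."""
    with open(template_path) as f:
        body = f.read()
    cfn.validate_template(TemplateBody=body)   # fail fast on template syntax errors
    cfn.create_stack(
        StackName=stack_name,
        TemplateBody=body,
        Capabilities=["CAPABILITY_NAMED_IAM"], # needed if the template creates IAM resources
    )
    cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)

deploy_stack("demo-stack", "template.yaml")
```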
PREFERRED QUALIFICATIONS
- Demonstrated success as a cloud infrastructure architect or consultant working with various platforms
- Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
- Cloud certifications
- Good exposure to Agile software development and DevOps practices such as Infrastructure as Code (IaC), continuous integration and automated deployment
- Experience presenting at public events such as technology conferences and hackathons, blogging, and writing on technical forums
An agile and innovative global analytics company.
Job Description – Developer (ETL + Database)
Develop, document & support ETL mappings, database structures & BI reports.
Perform unit testing of their own developments.
Participate in the UAT process and ensure quick resolution of any UAT issues.
Manage different environments and be responsible for proper deployment of code in all client environments.
Prepare release documents.
Prepare and maintain project documents as advised by team leads.
Skill-sets:
3+ years of hands-on experience with ETL tools (Pentaho Spoon, Talend) and with MS SQL Server, Oracle & Sybase database tools.
Ability to write complex SQL and database procedures.
Good knowledge and understanding of data warehouse concepts, ETL concepts, ETL loading strategies, data archiving, data reconciliation, ETL error handling, etc. (see the sketch after this list).
Strong problem-solving skills.
Good communication skills – written and verbal.
Self-motivated, team player, action and result oriented.
Ability to work successfully under tight project schedules.
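As a hedged illustration of the ETL error-handling and reconciliation concepts listed above, the following Python sketch loads rows in a batch, quarantines bad records instead of aborting the load, and reconciles row counts at the end. The transformation and field names are hypothetical:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transform(row: dict) -> dict:
    """Hypothetical transformation; raises ValueError on bad data."""
    if row.get("amount") is None:
        raise ValueError("missing amount")
    return {**row, "amount": float(row["amount"])}

def load(rows):
    """Load rows, diverting failures to a reject pile, then reconcile counts."""
    loaded, rejected = [], []
    for row in rows:
        try:
            loaded.append(transform(row))
        except ValueError as exc:
            log.warning("rejected row %s: %s", row, exc)
            rejected.append(row)
    # Reconciliation: every source row must be accounted for.
    assert len(loaded) + len(rejected) == len(rows)
    log.info("loaded=%d rejected=%d", len(loaded), len(rejected))
    return loaded, rejected

load([{"amount": "10.5"}, {"amount": None}])
```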
at Magic9 Media and Consumer Knowledge Pvt. Ltd.
Job Description
This requirement is to service our client, a leading big-data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves, and the ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Apply your experience across the development lifecycle: analysis, design, development, testing and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
Qualifications:
- 3-5 years of relevant work experience in the areas outlined below
- Experience in extracting data using SQL from large databases (see the sketch after this list)
- Experience in writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General purpose programming (Python, Scala, Java) and familiarity with Hadoop is a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable analyzing complex, high-volume, high-dimensional data from varying sources
- Excellent verbal, written and computer communication skills
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
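As a small, hedged example of the SQL extraction skill above, this Python sketch pulls a large result set in chunks with pandas so the data never has to fit in memory at once. The connection URL, query and chunk size are hypothetical placeholders:

```python
import pandas as pd
import sqlalchemy

# Hypothetical connection URL (placeholder, not from the posting).
engine = sqlalchemy.create_engine("postgresql://user:pass@host/db")

query = (
    "SELECT device_id, event_ts, duration_s "
    "FROM viewing_events WHERE event_ts >= '2024-01-01'"
)

total_rows = 0
# chunksize streams the result set instead of loading it all at once.
for chunk in pd.read_sql(query, engine, chunksize=50_000):
    total_rows += len(chunk)  # replace with real per-chunk processing
print(f"processed {total_rows} rows")
```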
This company provides on-demand cloud computing platforms.
- 15+ years of Hands-on technical application architecture experience and Application build/ modernization experience
- 15+ years of experience as a technical specialist in Customer-facing roles.
- Ability to travel to client locations as needed (25-50%)
- Extensive experience architecting, designing and programming applications in an AWS Cloud environment
- Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
- Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
- Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
- Expertise in Agile software development
- Experience with continuous integration tools (e.g. Jenkins)
- Hands-on familiarity with CloudFormation
- Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
- Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
- Strong practical application development experience on Linux and Windows-based systems
- Extracurricular software development passion (e.g. active open-source contributor)
● Good experience with continuous integration and deployment tools like Jenkins, Spinnaker, etc.
● Ability to understand problems and craft maintainable solutions.
● Working cross-functionally with a broad set of business partners to understand and integrate their API or data-flow systems with Xeno, so a minimal understanding of data and API integration is a must.
● Experience with Docker and microservice-based architecture using orchestration platforms like Kubernetes.
● Understanding of public cloud; we use Azure and Google Cloud.
● Familiarity with web servers like Apache, nginx, etc.
● Knowledge of monitoring tools such as Prometheus, Grafana, New Relic, etc. (see the sketch after this list).
● Scripting in languages like Python, Golang, etc. is required.
● Some knowledge of database technologies like MySQL and Postgres is required.
● Understanding of Linux, specifically Ubuntu.
● Bonus points for knowledge of security best practices.
● Knowledge of Java or NodeJS would be a significant advantage.
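As a hedged illustration of the monitoring tooling mentioned above, this Python sketch exposes a custom metric with the official prometheus_client library so a Prometheus server can scrape it. The metric name and port are hypothetical:

```python
import random
import time

from prometheus_client import Counter, start_http_server

# Hypothetical metric; Prometheus would scrape it from :8000/metrics.
REQUESTS = Counter("demo_requests_total", "Requests handled, by status", ["status"])

if __name__ == "__main__":
    start_http_server(8000)
    while True:
        status = random.choice(["ok", "error"])
        REQUESTS.labels(status=status).inc()
        time.sleep(1)
```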
Initially, when you join, some of the projects you'd get to own are:
● Audit and improve the overall security of the infrastructure.
● Set up different environments for different teams, such as QA, Development and Business.
We will disclose the client name after the initial screening.
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role and at least 3-4 years' experience designing and building analytics solutions in Azure
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Experience developing batch-processing, streaming and integration solutions for structured and unstructured data
- Demonstrated experience with ETL development both on-premises and in the cloud, using SSIS, Data Factory, Azure Analysis Services and other ETL technologies
- Experience performing design, development & deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
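As a hedged illustration of working with the Azure services listed above, this Python sketch triggers a Data Factory pipeline run and polls its status via the azure-mgmt-datafactory SDK. The subscription, resource group, factory and pipeline names are hypothetical placeholders:

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Hypothetical identifiers (placeholders, not from the posting).
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-analytics"
FACTORY = "adf-analytics"
PIPELINE = "pl_daily_load"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline, then poll until it reaches a terminal state.
run = adf.pipelines.create_run(RESOURCE_GROUP, FACTORY, PIPELINE, parameters={})
while True:
    status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
print(f"pipeline finished with status: {status}")
```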