11+ Dataflow architecture Jobs in India
Apply to 11+ Dataflow architecture Jobs on CutShort.io.
Pune-based MNC IT company
The candidate will be deployed at a financial captive organization in Pune (Kharadi).
Job details:
Experience: 10 to 18 years
Mandatory skills:
- Data migration
- Dataflow
The ideal candidate for this role will have the below experience and qualifications:
- Experience building a range of services with a cloud service provider (ideally GCP)
- Hands-on design and development on Google Cloud Platform (GCP) across a wide range of services, including GCP storage and database technologies
- Hands-on experience architecting, designing, or implementing solutions on GCP, Kubernetes, and other Google technologies, including security and compliance (e.g., IAM and cloud compliance/auditing/monitoring tools)
- Desired skills within the GCP stack: Cloud Run, GKE, serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion
- Prior experience migrating on-premises applications to cloud environments; knowledge and hands-on experience of Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premises and in GCP
- Integrate, configure, deploy, and manage centrally provided common cloud services (e.g., IAM, networking, logging, operating systems, containers)
- Manage SDN in GCP; knowledge and experience of DevOps technologies for continuous integration and delivery in GCP using Jenkins
- Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge or experience in DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppD, Docker, Kubernetes
- Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product's vision; ensure compliance with centrally defined security policies
- Financial experience is preferred
- Ability to learn new technologies and rapidly prototype newer concepts
- Top-down thinker, excellent communicator, and great problem solver
Experience: 10 to 18 years
Location: Pune
The candidate must have experience in the following:
- GCP Data Platform
- Data processing: Dataflow, Dataprep, Data Fusion
- Data storage: BigQuery, Cloud SQL
- Pub/Sub, GCS buckets
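As a rough illustration of the kind of work this stack implies, here is the fixed-window aggregation pattern that Dataflow pipelines commonly implement, sketched in plain Python (no GCP SDKs; the event shape and window size are assumptions for illustration):

```python
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Count events per key within fixed time windows.

    Plain-Python sketch of the fixed-window aggregation typically
    built with Dataflow/Apache Beam; `events` is a list of
    (timestamp_seconds, key) pairs.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # window the event falls in
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical clickstream events: (timestamp, action)
events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
print(window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

In a real pipeline the same grouping would be expressed with windowing and a combine transform, and the input would arrive via Pub/Sub rather than an in-memory list.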
● Able to contribute to gathering functional requirements, developing technical specifications, and project and test planning
● Demonstrating technical expertise and solving challenging programming and design problems
● Roughly 80% hands-on coding
● Generate technical documentation and PowerPoint presentations to communicate architectural and design options, and educate development teams and business users
● Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
● Work cross-functionally with various Bidgely teams, including product management, QA/QE, various product lines, and/or business units to drive forward results
Requirements
● BS/MS in computer science or equivalent work experience
● 2-4 years’ experience designing and developing applications in Data Engineering
● Hands-on experience with big data ecosystems:
● Hadoop, HDFS, MapReduce, YARN, AWS Cloud, EMR, S3, Spark, Cassandra, Kafka, ZooKeeper
● Expertise with any of the following object-oriented languages: Java/J2EE, Scala, Python
● Strong leadership experience: leading meetings, presenting if required
● Excellent communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
● Expertise in the software design/architecture process
● Expertise with unit testing and test-driven development (TDD)
● Experience with cloud platforms, preferably AWS
● A good understanding of, and the ability to develop, software, prototypes, or proofs of concept (POCs) for various data engineering requirements
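To make the TDD item above concrete, a minimal sketch in the test-first style using the standard `unittest` module (the `slugify` helper and its behavior are hypothetical, invented purely for illustration):

```python
import unittest

def slugify(title: str) -> str:
    """Hypothetical helper: lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    # In TDD these tests are written first and fail until slugify() is implemented.
    def test_basic(self):
        self.assertEqual(slugify("Data Engineering 101"), "data-engineering-101")

    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  Hello   World "), "hello-world")

# Run with: python -m unittest <this_file>
```

The point is the workflow, not the helper: each behavior is pinned by a failing test before the implementation exists.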
Top Management Consulting Company
We are looking for a technically driven MLOps Engineer for one of our premium clients.
Key Skills
• Expert hands-on knowledge of cloud platform infrastructure and administration (Azure/AWS/GCP), with strong knowledge of cloud services integration and cloud security
• Expertise setting up CI/CD processes and building and maintaining secure DevOps pipelines with at least 2 major DevOps stacks (e.g., Azure DevOps, GitLab, Argo)
• Experience with modern development methods and tooling: containers (e.g., Docker) and container orchestration (K8s), CI/CD tools (e.g., CircleCI, Jenkins, GitHub Actions, Azure DevOps), version control (Git, GitHub, GitLab), orchestration/DAG tools (e.g., Argo, Airflow, Kubeflow)
• Hands-on Python 3 coding skills (e.g., APIs), including automated testing frameworks and libraries (e.g., pytest), Infrastructure as Code (e.g., Terraform), and Kubernetes artifacts (e.g., deployments, operators, Helm charts)
• Experience setting up at least one contemporary MLOps toolchain (e.g., experiment tracking, model governance, packaging, deployment, feature stores)
• Practical knowledge of delivering and maintaining production software such as APIs and cloud infrastructure
• Knowledge of SQL (intermediate level or higher preferred) and familiarity working with at least one common RDBMS (MySQL, Postgres, SQL Server, Oracle)
- KSQL
- Data Engineering spectrum (Java/Spark)
- Spark Scala / Kafka Streaming
- Confluent Kafka components
- Basic understanding of Hadoop
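The Kafka streaming items above revolve around stream/table duality. As a hedged plain-Python sketch (not the Kafka Streams or KSQL API), this reduces a key/value changelog stream to its latest state per key, the way a compacted topic or KTable does:

```python
def table_from_stream(records):
    """Reduce a key/value changelog to latest-value-per-key.

    Plain-Python illustration of KTable / log-compaction semantics:
    later records overwrite earlier ones, and a None value (a
    "tombstone") deletes the key.
    """
    table = {}
    for key, value in records:
        if value is None:
            table.pop(key, None)  # tombstone: remove the key
        else:
            table[key] = value    # later record wins
    return table

# Hypothetical presence changelog
changelog = [("user1", "online"), ("user2", "online"),
             ("user1", "away"), ("user2", None)]
print(table_from_stream(changelog))  # {'user1': 'away'}
```

In Kafka Streams the equivalent is consuming a topic as a KTable; the sketch only shows the last-write-wins and tombstone semantics.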
Job ID: RP100
Work Location: Remote
Required Experience: 4 to 7 years
Job Description
- Must have Google Cloud BigQuery experience
- Strong experience with data analysis, data modeling, and governance, with excellent analytical and problem-solving abilities
- Good knowledge of data warehouses and dataflow/ETL pipelines
- Design, configuration, and administration of database software on a cloud platform
- Monitoring, troubleshooting, and performance tuning of DB objects
- Experience with table partitioning, clustered tables, materialized views, external tables, etc.
Any RDBMS technology:
- Good experience in DB design, with knowledge of ER diagrams, PK/FK, stored procedures, functions, triggers, and indexes
- Understanding the requirements of the application team and creating the necessary DB objects following best practices
- Managing logins and database users, as well as database roles, application roles, and other security principals within the database
- Deep knowledge of indexes, performance tuning, and complex SQL query patterns
- Monitoring, tuning, and troubleshooting database-related issues
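To make the partitioning and clustering items concrete, here is a small sketch that composes BigQuery DDL for a date-partitioned, clustered table (the dataset, table, and column names are made up for illustration):

```python
def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build BigQuery DDL for a date-partitioned, clustered table."""
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical events table, partitioned by day and clustered for
# user-level lookups (cuts scanned bytes on filtered queries).
ddl = partitioned_table_ddl(
    "analytics.events",
    [("event_ts", "TIMESTAMP"), ("user_id", "STRING"), ("action", "STRING")],
    partition_col="event_ts",
    cluster_cols=["user_id", "action"],
)
print(ddl)
```

Partitioning on the date of a timestamp column and clustering on frequently filtered columns is the standard BigQuery cost/performance lever this role would be expected to apply.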
About Us:
Mobile Programming LLC is a US-based digital transformation company. We help enterprises transform ideas into innovative and intelligent solutions spanning the Internet of Things, digital commerce, business intelligence analytics, and cloud programming. Bring your challenges to us, and we will give you the smartest solutions. From conceptualizing and engineering to advanced manufacturing, we help customers build and scale products fit for the global marketplace.
Mobile Programming LLC has offices in Los Angeles, San Jose, Glendale, San Diego, Phoenix, Plano, New York, Fort Lauderdale, and Boston. Mobile Programming is an SAP Preferred Vendor, an Apple Adjunct Partner, a Google Empaneled Mobile Vendor, and a Microsoft Gold Certified Partner.
- Expert software implementation and automated testing
- Promoting development standards, code reviews, mentoring, knowledge sharing
- Improving our Agile methodology maturity
- Product and feature design, scrum story writing
- Build, release, and deployment automation
- Product support & troubleshooting
Who we have in mind:
- Demonstrated experience as a Java developer
- A deep understanding of enterprise/distributed architecture patterns, and the ability to demonstrate their practical use
- Turn high-level project requirements into application-level architecture and collaborate with team members to implement the solution
- Strong experience and knowledge of the Spring Boot framework and microservice architecture
- Experience in working with Apache Spark
- Solid demonstrated object-oriented software development experience with Java, SQL, Maven, relational/NoSQL databases and testing frameworks
- Strong working experience developing RESTful services
- Experience working with application frameworks such as Spring, Spring Boot, and AOP
- Exposure to tools such as Jira, Bamboo, Git, and Confluence would be an added advantage
- Excellent grasp of the current technology landscape, trends and emerging technologies
Required Experience: 5 - 7 Years
Skills: ADF, Azure, SSIS, Python
Job Description
An Azure Data Engineer with hands-on SSIS migration and ADF expertise.
Roles & Responsibilities
• Overall 6+ years' experience in cloud data engineering, with hands-on experience in ADF (Azure Data Factory), is required.
• Hands-on experience with SSIS-to-ADF migration is preferred.
• Migrating SQL Server Integration Services (SSIS) workloads to SSIS in ADF (must have done at least one migration).
• Hands-on experience implementing Azure Data Factory frameworks, scheduling, and performance tuning.
• Hands-on experience migrating SSIS solutions to ADF.
• Hands-on experience with ADF development.
• Hands-on experience with MPP database architecture.
• Hands-on experience in Python.
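As a rough sketch of the kind of artifact ADF work produces, here is a minimal copy-pipeline definition built as a Python dict. The activity, dataset, and pipeline names are hypothetical, and the dict follows only the general shape of ADF pipeline JSON, not the complete schema:

```python
import json

# Minimal ADF-style copy pipeline expressed as a Python dict.
# All names (CopyStagingData, BlobSource, SqlSink) are made up.
pipeline = {
    "name": "CopyStagingData",
    "properties": {
        "activities": [
            {
                "name": "CopyBlobToSql",
                "type": "Copy",  # the Copy activity moves data between datasets
                "inputs": [{"referenceName": "BlobSource", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlSink", "type": "DatasetReference"}],
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

SSIS-to-ADF migration work largely consists of re-expressing SSIS package logic as pipelines of activities like this, then layering in scheduling (triggers) and performance tuning.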
Roles and responsibilities:
- Responsible for development and maintenance of applications with technologies involving Enterprise Java and Distributed technologies.
- Experience with Hadoop, Kafka, Spark, Elasticsearch, SQL, Kibana, and Python, plus experience with machine learning and analytics.
- Collaborate with developers, product managers, business analysts, and business users in conceptualizing, estimating, and developing new software applications and enhancements.
- Collaborate with QA team to define test cases, metrics, and resolve questions about test results.
- Assist in the design and implementation process for new products, research and create POC for possible solutions.
- Develop components based on business and/or application requirements
- Create unit tests in accordance with team policies & procedures
- Advise and mentor team members in specialized technical areas, and fulfill administrative duties as defined by the support process
- Work with cross-functional teams during crises to address and resolve complex incidents and problems, in addition to the assessment, analysis, and resolution of cross-functional issues.