4+ NFS Jobs in Chennai | NFS Job openings in Chennai
Apply to 4+ NFS Jobs in Chennai on CutShort.io. Explore the latest NFS Job opportunities across top companies like Google, Amazon & Adobe.
IT Software Company in Chennai
Roles & Responsibilities
The selected engineer's day-to-day responsibilities include:
• Maintaining UNIX user accounts and access management
• Good understanding of LDAP versus local IAM management
• Setting up and maintaining repositories
• Comfortable troubleshooting issues and failures on Linux-based systems, with a good grasp of the Linux command line
• Specialist knowledge of Unix technologies – Red Hat, SUSE Linux & Solaris
• Good hands-on experience with parallel filesystems
• Good understanding of basic networking technologies, e.g. TCP/IP, IP multipathing, IP aggregation, DNS, DHCP
• Comfortable exploring and testing new open-source software systems should the need arise to solve particular problems
• Setting up and maintaining SUSE and Red Hat clusters with DRBD, STONITH and Corosync
• Good knowledge of virtualisation/hypervisor technologies on various platforms, e.g. VMware, HPE Serviceguard clusters
• Good understanding of common enterprise technologies, e.g. SAN, enterprise backup, enterprise monitoring
• Good knowledge of managing NFS file systems and NFS servers
• Troubleshooting performance issues
• Shell, PowerShell and Python scripting
• Creating custom images based on security requirements
• Planning and executing security patches periodically
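Several of the responsibilities above (NFS handling, Linux troubleshooting, scripting) come together in routine mount auditing. As a hedged illustration only, here is a small Python helper that parses `/proc/mounts`-style records to pick out NFS mounts; the sample data and the server name `filer01` are hypothetical:

```python
def parse_nfs_mounts(mounts_text):
    """Parse /proc/mounts-style text and return (device, mountpoint, options)
    tuples for NFS and NFSv4 filesystems."""
    nfs_mounts = []
    for line in mounts_text.splitlines():
        fields = line.split()
        # /proc/mounts format: device mountpoint fstype options dump pass
        if len(fields) >= 4 and fields[2] in ("nfs", "nfs4"):
            device, mountpoint, options = fields[0], fields[1], fields[3]
            nfs_mounts.append((device, mountpoint, options.split(",")))
    return nfs_mounts

# Hypothetical sample resembling /proc/mounts output
sample = """\
/dev/sda1 / ext4 rw,relatime 0 0
filer01:/export/home /home nfs4 rw,hard,intr,rsize=1048576 0 0
tmpfs /run tmpfs rw,nosuid 0 0
"""
print(parse_nfs_mounts(sample))
```

On a live system the same function could be fed the contents of `/proc/mounts` directly; parsing a string keeps the sketch self-contained.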
Education, Experience & other requirements
• Degree in Computer Science, Information Technology or a related discipline
• Minimum 4-5 years of experience in Unix server setup, administration, upgrades and support
• Good verbal and written communication skills; able to manage key stakeholders, i.e. project team, security, compliance and IT operations
• Strong analytical and troubleshooting skills
at Delivery Solutions
Job Description: We are seeking an experienced Kafka Platform Administrator to join our team. As a Kafka expert, you will be responsible for managing and optimizing our Confluent Kafka platform across on-premises and cloud environments. Your expertise in Kafka operations, performance tuning, and ecosystem components will be crucial in maintaining a robust and efficient messaging system.
5-7 years of Confluent Kafka platform experience
Responsibilities:
- Administration of the Confluent Kafka Platform: Manage Kafka clusters, brokers, and topics both on-premises and in the cloud.
- Monitor system health, troubleshoot issues, and ensure high availability.
- Configure and fine-tune Kafka parameters for optimal performance.
- Implement security measures and access controls.
- Collaborate with development teams to ensure seamless integration with Kafka.
- Kafka Operations: Understand Kafka internals and operational best practices.
- Handle topics, partitions, and consumer groups efficiently.
- Execute ksqlDB queries for data processing and analysis.
- Maintain schema compatibility using the Schema Registry.
- Kafka Ecosystem Knowledge: Familiarity with Kafka brokers, ZooKeeper/KRaft, ksqlDB, connectors, Schema Registry, and Control Center.
- Ensure platform interoperability with other systems.
- Kafka Cluster Linking and Replication: Configure cross-cluster replication for disaster recovery and data synchronization.
- Manage multi-regional Kafka clusters.
- System Performance Tuning: Optimize Kafka performance by adjusting configurations, monitoring resource usage, and identifying bottlenecks.
- Collaborate with application teams to meet performance requirements.
- Operating Systems: Proficiency in Red Hat Linux administration.
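The topic, partition and consumer-group work listed above can be sketched with a simplified, self-contained model of Kafka's range partition-assignment strategy for a single topic. This is an illustrative re-implementation for intuition, not the Confluent client API; consumer names are hypothetical:

```python
def range_assign(partitions, consumers):
    """Simplified model of Kafka's range assignor for one topic:
    sorted consumers receive contiguous partition ranges, and the first
    (num_partitions % num_consumers) consumers get one extra partition."""
    consumers = sorted(consumers)
    base, extra = divmod(len(partitions), len(consumers))
    assignment, start = {}, 0
    for i, consumer in enumerate(consumers):
        count = base + (1 if i < extra else 0)
        assignment[consumer] = partitions[start:start + count]
        start += count
    return assignment

# Six partitions across two consumers -> three contiguous partitions each
print(range_assign(list(range(6)), ["consumer-a", "consumer-b"]))
```

In production this assignment is computed during a consumer-group rebalance by the group coordinator and the configured assignor; modelling it by hand is mainly useful for reasoning about partition skew when partition counts do not divide evenly.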
JOB DESCRIPTION AND RESPONSIBILITIES:
- Work as a member of the File Transfer Engineering team involved in the configuration, maintenance, and support of the Sterling File Gateway and Connect:Direct (NDM) environment.
- This position participates in the design and development of technical solutions within the company's strategic B2B/MFT platform that deliver technical improvements, improve quality and operational excellence, lower cost, satisfy regulatory and compliance requirements, mitigate risk, and meet business objectives.
QUALIFICATIONS:
- Bachelor's degree in Computer Engineering/Science or a related field
- 4-5 years' experience supporting real-time production environments.
- Extensive debugging and issue-resolution experience
- General knowledge of industry standards and best practices for supporting middleware technologies
- Knowledge of Ping Access, Ping Federate, LDAP, Active Directory, and Connect:Direct (NDM) on Unix/Windows/Mainframe
- Developing transformation maps and processes for any-to-any data formats and RDBMS queries using IBM Sterling B2B Integrator mapping and business process modeling tools.
- This position requires sophisticated knowledge of B2B/MFT concepts, technologies, best practices, standards, and architectures within the IBM File Gateway suite.
- Minimum five years of technical experience in IBM File Gateway Suite (IBM Sterling B2B Integrator (preferably version 6.x) and IBM Sterling File Gateway, Sterling External Authentication Server, Secure Proxy, IBM Control Center Monitor) including:
- Experience with Installation and configuration of multi-node architecture environments, including upgrades and platform migration.
- Experience in setting up and testing disaster recovery environment to handle failover scenarios.
- Experience with onboarding new partners, transactions and making changes to existing partners to meet internal and external requirements.
- Experience in configuring adapters, service configuration and policies.
- Experience in setting up and configuring industry-standard communication protocols such as SFTP, FTPS, FTP, HTTP, Connect:Direct and WebSphere MQ
- Experience with security configurations: SSL certificates, SSH keys, PGP/GPG
- Experience with IBM File Gateway Suite maintenance and administration
- Experience with external authentication integration
- Experience with Java extensibility
- Development of custom protocols, custom file layers and custom consumer identification
- Must possess strong analytical, problem-solving, and root cause analysis skills.
- Hands-on experience in Production Support, resolving critical incidents using the IBM File Gateway Suite
- Knowledge of SQL queries, database schemas and related concepts using Oracle
- Knowledge of IBM Sterling File Gateway APIs.
- Excellent communication and customer service skills.
This is a work-from-office role.
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs, solutions and validate with internal engineering and customer teams, and establish a good network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Communicate proactively with customers; manage the planning and execution of platform implementations at customer sites.
- Work with the product team in developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies: Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- At least 2 years of experience in IT infrastructure management
- Experience in working with large-scale IT infra, including applications, databases, and networks.
- Experience in working with monitoring tools, automation tools
- Hands-on experience in Linux and scripting.
- Knowledge of or experience with the following technologies is a plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, the ELK stack, REST APIs, web services, and JMX.
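The monitoring and data-analysis responsibilities above can be sketched with a minimal anomaly check: flag samples that deviate from the mean of a trailing window by more than a z-score threshold. This is a generic illustration, not vuSmartMaps functionality; the metric values and threshold are hypothetical:

```python
from statistics import mean, stdev

def flag_anomalies(samples, window=5, z_threshold=3.0):
    """Return indices of samples deviating from the mean of the preceding
    `window` samples by more than `z_threshold` standard deviations."""
    flagged = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Steady response times (ms) with one hypothetical latency spike at index 6
latencies = [100, 102, 98, 101, 99, 100, 500, 101]
print(flag_anomalies(latencies))
```

Real monitoring platforms use far richer models (seasonality, multi-metric correlation), but a rolling z-score like this is a common first-pass check for latency or throughput spikes.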