11+ JasperSoft Jobs in India
Bengaluru (Bangalore)
5 - 10 yrs
₹4L - ₹18L / yr
Jasper
JasperReports
ETL
JasperSoft
OLAP
+3 more
Job Description - Jasper
- Knowledge of Jasper report server administration, installation and configuration
- Knowledge of report deployment and configuration
- Knowledge of Jaspersoft Architecture and Deployment
- Knowledge of User Management in Jaspersoft Server
- Experience in developing Complex Reports using Jaspersoft Server and Jaspersoft Studio
- Understand the Overall architecture of Jaspersoft BI
- Experience in creating Ad Hoc Reports, OLAP, Views, Domains
- Experience in report server (Jaspersoft) integration with web application
- Experience in the JasperReports Server web services API and the Jaspersoft Visualize.js web services API
- Experience in creating dashboards with visualizations
- Experience in security and auditing, metadata layer
- Experience interacting with stakeholders for requirement gathering and analysis
- Good knowledge of ETL design and development, and of logical and physical data modeling (relational and dimensional)
- Strong self-initiative to strive for both personal and technical excellence
- Coordinate efforts across the product development team and the business analyst team
- Strong business and data analysis skills
- Domain knowledge of healthcare is an advantage
- Should coordinate effectively with onshore resources on development
- Data-oriented professional with good communication skills and a great eye for detail
- Interpret data, analyze results and provide insightful inferences
- Maintain relationship with Business Intelligence stakeholders
- Strong Analytical and Problem Solving skills
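The REST and Visualize.js integration work listed above comes down to HTTP calls against the report server. A minimal sketch of building a JasperReports Server REST v2 run-report URL, assuming a local server and a sample report path (both hypothetical):

```python
from urllib.parse import quote

def report_export_url(server, report_path, fmt="pdf", params=None):
    """Build a JasperReports Server REST v2 run-report URL.

    GET {server}/rest_v2/reports{report_path}.{fmt}?param=value runs the
    report unit and streams the rendered output in the requested format.
    """
    url = f"{server}/rest_v2/reports{quote(report_path)}.{fmt}"
    if params:
        query = "&".join(f"{k}={quote(str(v))}" for k, v in params.items())
        url = f"{url}?{query}"
    return url

# Hypothetical server and report path, for illustration only:
url = report_export_url(
    "http://localhost:8080/jasperserver",
    "/reports/samples/AllAccounts",
    params={"Country": "India"},
)
# The URL is then fetched with basic auth (e.g. urllib.request or requests).
```

The same endpoint family also serves `.xlsx`, `.csv`, and `.html` by swapping the `fmt` extension.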
Pune
15 - 20 yrs
₹10L - ₹15L / yr
Technical Writing
Documentation
API
Software documentation
Job Title : Technical Writer
CTC : 40-43 LPA
Experience : 15 yrs
Location : Pune
Working Mode : Onsite
Primary skills : Technical Writing, API Documentation, Big Data, Logistics, Banking
Key Responsibilities:
- Create and maintain high-quality technical documentation across various domains, including big data, data management, logistics, and banking.
- Develop user-friendly guides, API documentation, and online help resources that effectively convey complex technical concepts to a diverse audience.
- Innovate and implement new documentation strategies to make our documentation more customer-centric.
- Collaborate with cross-functional teams to ensure consistency and accuracy across documentation suites.
- Utilize scriptwriting, video production, and editing skills to produce engaging and informative visual content that enhances the user experience.
- Mentor and train colleagues in effective documentation techniques and best practices.
- Drive process improvements to reduce time-to-delivery and enhance the overall efficiency of documentation processes.
- Stay up-to-date with industry trends and technologies to ensure our documentation remains cutting-edge and relevant.
Qualifications:
- Bachelor’s degree in a relevant field or equivalent practical experience.
- Over 15 years of experience in the software documentation field, with a strong track record in big data, data management, logistics, and banking.
- Proficiency in creating user-friendly guides, API documentation, and online help resources.
- Demonstrated success in improving user engagement through documentation revamps.
- Strong scriptwriting, video production, and editing skills for visual content creation.
- Exceptional collaboration skills and the ability to simplify complex technical concepts for diverse audiences.
- A passion for sharing knowledge and insights through mentoring and training.
Pune
7 - 12 yrs
₹25L - ₹30L / yr
Snowflake
Snowflake schema
ETL
Data Warehouse (DWH)
Python
+8 more
Help us modernize our data platforms, with a specific focus on Snowflake
• Work with various stakeholders, understand requirements, and build solutions/data pipelines
that address the needs at scale
• Bring key workloads to the clients’ Snowflake environment using scalable, reusable data
ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT and data models
Skills - 50% of below:
• A passion for all things data; understanding how to work with it at scale, and more importantly,
knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy
cloning, tasks, Snowpipe etc
• Expertise in data modeling, with a good understanding of modeling approaches like Star
schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT either via a code-first approach or using low-code tools like AWS Glue,
Appflow, Informatica, Talend, Matillion, Fivetran etc
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau or PowerBI or Domo or any similar tool
• Experience with Data Virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio
etc.
• Certified SnowPro Advanced: Data Engineer is a must.
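Zero-copy cloning, one of the native Snowflake capabilities listed above, is metadata-only: the clone shares the source table's micro-partitions until either side changes, so it completes near-instantly. A hedged sketch of the SQL involved, with hypothetical table names (a connector such as snowflake-connector-python would execute the statement):

```python
def clone_table_sql(source, target, at_timestamp=None):
    """Build a CREATE TABLE ... CLONE statement, optionally with Time Travel."""
    sql = f"CREATE TABLE {target} CLONE {source}"
    if at_timestamp:
        # Clone the table as it existed at a past point in time.
        sql += f" AT (TIMESTAMP => '{at_timestamp}')"
    return sql + ";"

print(clone_table_sql("prod.sales.orders", "dev.sales.orders_copy"))
```

The same `CLONE` keyword also works at schema and database granularity, which is why it is popular for spinning up dev environments.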
Bengaluru (Bangalore)
3 - 8 yrs
₹1L - ₹9L / yr
Business Intelligence (BI)
Microsoft Business Intelligence (MSBI)
BI developer
SQL Server Reporting Services (SSRS)
Crystal Reports
Primary Role
• As an SSRS Report Engineer, the primary job role is to develop transactional data reports using Microsoft SQL Server Reporting Services 2016 or above.
• This includes developing high performing Stored Procedures, Functions, and corresponding reporting formats.
• Required to design the required logics for the queries and collaboratively coordinate with the rest of the team.
Main Duties / Responsibilities
• Design & implement high performing T-SQL queries as per the functional specification.
• Create Technical Specification.
• Effective troubleshooting of technical issues and bugs.
• Package deployment and release management.
• Provides technical support to customers.
• Participate in Requirements discussions.
• Must be able to manage timelines and task priorities.
• Attention to detail and maintaining high-quality standards is a must.
TECHNICAL KNOWLEDGE
• Experience in Microsoft SQL Server 2016 (or above) and Microsoft SQL Server Reporting Services 2016 (or above).
• Knowledge of T-SQL including complex SQL queries, stored procedures, views and functions.
• Experience with usage of indexes and T-SQL performance tuning techniques.
QUALIFICATIONS
• Bachelor’s Degree or an equivalent qualification
Questions for Candidates
• Requires experience in query optimization.
• Must be hands-on in SSRS.
• Strong in T-SQL, including the use of temp tables when developing stored procedures for report development.
• Should guide and mentor the team to ensure a quality outcome.
• Very good grasp of basic development principles:
o Adding Cascading parameters
o Grouping subtotal/ Grand total
o Sub Reports/ HyperLink creation
o Matrix Grid
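The cascading-parameters item above works by filtering the child parameter's dataset with the parent parameter: choosing a Country repopulates the City list before the report runs. A sketch of the three dataset queries involved, with hypothetical table and column names:

```python
# Parent parameter dataset: all available countries.
COUNTRY_DATASET = "SELECT DISTINCT Country FROM dbo.Customers ORDER BY Country;"

# Child parameter dataset: SSRS passes the chosen @Country into this query,
# so the City dropdown only shows cities in the selected country.
CITY_DATASET = (
    "SELECT DISTINCT City FROM dbo.Customers "
    "WHERE Country = @Country ORDER BY City;"
)

# Main report dataset: uses both parameters.
REPORT_QUERY = (
    "SELECT o.OrderID, o.OrderDate, o.Amount FROM dbo.Orders o "
    "JOIN dbo.Customers c ON c.CustomerID = o.CustomerID "
    "WHERE c.Country = @Country AND c.City = @City;"
)
```

In the report designer, the dependency is created simply by referencing `@Country` inside the City dataset; SSRS then refreshes the child list whenever the parent changes.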
SUVI (offers the best hike on the current or offered CTC)
Agency job
via SUVI BUSINESS VENTURE by VINOTH KUMAR
Chennai
8 - 12 yrs
₹10L - ₹15L / yr
Java
Spring
Hibernate (Java)
Webservices
Java Server Faces (JSF)
+10 more
Hands-on experience in Java, Spring, Hibernate, web services, JSF, AJAX, XML, JavaScript, jQuery, and PostgreSQL/MySQL.
· At least 8 Years of Java development experience out of which person must have worked 4 years as Lead architect
· Hands-on experience in query optimization and performance tuning.
· Hands-on experience with design patterns and applying them in module design.
· Must have excellent knowledge of service oriented architecture.
· Good knowledge of cloud computing Deployment Models
· Experience with application servers like JBoss, Tomcat
· Excellent verbal, written, and presentation communications skills
· Must have good documentation and presentation skills
· Able to work well under pressure and time constraints
· Excellent problem solving skills and ability to think creatively under pressure.
Would be given preference if person possesses below skills:
· Prior experience in Product development especially product enhancement
Responsibilities: -
· Design a module using software best practices and explain to development team.
· Responsible for the development and enhancements of new & existing products.
· Will liaise with different members of the cross functions teams to drive projects forward.
· The candidate should be able to own and drive the Product Direction and manage PDLC that includes research and discovery, design, development, release management, as well as experimentation and validation
- We will provide the best hike on the candidate's current or offered CTC (up to ₹25-30 LPA).
An US based firm offering permanent WFH
Agency job
via Jobdost by Mamatha A
Remote only
3 - 6 yrs
₹14L - ₹18L / yr
Django
NodeJS (Node.js)
GraphQL
MongoDB
SQL
+7 more
This person MUST have:
- BE Computer Science or equivalent
- Cloud app development experience.
- Strong Troubleshooting and debugging skills
- A strong passion for writing simple, clean, and efficient code.
- 3 years of experience with the Django framework and other backend technologies.
- Knowledge of NodeJS
- Experience with building, modifying, and extending API endpoints (REST or GraphQL) for data retrieval and persistence.
- Understand how to use a database like Postgres (preferred choice), SQLite, MongoDB, MySQL.
- Experience creating high-performance applications.
- Experience with messaging and broker tools - Rabbitmq, MQTT
- Experience with SQL and NoSQL databases
- Experience with the full software development life cycle, including requirements collection, design, implementation, testing, and operational support.
- Knowledge of web services
- Proficient understanding of code versioning tools Git.
- Hands-on experience deploying and managing infrastructure with CloudFormation/Terraform
- Experience managing AWS infrastructure.
- Hands-on experience in Linux environment.
- Basic understanding of Kubernetes/Docker orchestration.
- Manages existing infrastructure/pipelines/engineering tools (on-prem or AWS) for the engineering team (build servers/Jenkins nodes etc.)
- Experience with scrum or other agile software development methodology.
- Excellent verbal and written communication, teamwork, decision-making, and influencing skills.
- Handle customer calls/emails regarding technical issues for end-users.
- Strong communication skills
- Attention to detail.
Experience:
- Min 3 year experience
Location:
- Ahmedabad Office Or,
- Work from home
Timings:
- 40 hours a week with a rotational shift every month.
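Building REST endpoints for data retrieval, as required above, usually means returning paginated responses. A framework-agnostic sketch of the logic (in Django this is what django.core.paginator or DRF pagination classes provide out of the box):

```python
def paginate(items, page=1, per_page=10):
    """Return one page of results plus the metadata a REST client needs."""
    total = len(items)
    start = (page - 1) * per_page
    return {
        "count": total,
        "page": page,
        "pages": -(-total // per_page) if total else 0,  # ceiling division
        "results": items[start:start + per_page],
    }

resp = paginate(list(range(25)), page=3, per_page=10)
# resp["results"] holds the last partial page: items 20 through 24.
```

Real endpoints would take `page`/`per_page` from query parameters and run the slice as a database `LIMIT`/`OFFSET` rather than in memory.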
Remote only
3 - 6 yrs
₹8L - ₹16L / yr
User Interface (UI) Design
HTML/CSS
ThreeJs (Three.js)
WebGL
API
+3 more
MUST have skills:
- BE Computer Science, MCA or equivalent
- Cloud app development experience
- Strong Understanding of Threejs Concepts
- Experience working on Three.js immersive projects using cameras, lights, textures, animations, and shaders.
- Understanding of APIs, pagination, searching, sorting
- Experience in setting up Projects with Webpack and the various loaders and plugins.
- Creating high performance UI etc.
- Proficient understanding of web markup, including HTML5 and CSS3.
- Basic understanding of 3D mathematics.
- 1+ years of experience in 3D technologies such as WebGL, Three.js, etc.
- Experience working in a Distributed/Cloud-based environment.
Experience:
- Min 3 year experience
- Startup experience is a must.
Location
- Remotely, anywhere in India
Timings:
- 40 hours a week but with 4 hours a day overlapping with client timezone. Typically clients are in California PST Timezone.
Position:
- Full time/Direct
- We have great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO leaves per year, annual increments, Diwali bonus, spot bonuses and other incentives etc.
- We don't believe in locking people in with large notice periods. You will stay here because you love the company. We have only a 15-day notice period.
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹12L / yr
Qlik
Qlikview
PHP
MySQL
Business Intelligence (BI)
+1 more
- Experience in developing QlikView/Qlik Sense BI visualizations and dashboards. Qlik Sense and NPrinting experience is a must.
- Strong knowledge of building Qlik Sense dashboards applying business rules and data validations.
- In-depth understanding of Qlik Sense Server Architecture, building QVDs and QVFs, applying business rules and data validations.
- Familiarity with charts and graphs, and creating complex visualizations for dynamic aggregation. Good knowledge of incremental data loading concepts and data modeling.
- Additionally, Qlik developers have to troubleshoot issues and provide the necessary solutions.
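Incremental data loading, mentioned above, keeps a high-water mark (in Qlik, typically alongside a stored QVD) and fetches only rows modified since the last load. The idea in plain Python, with hypothetical field names:

```python
def incremental_load(cached_rows, fetch_since):
    """Append only rows newer than the cache's high-water mark.

    cached_rows: list of dicts, each with a 'modified' timestamp string.
    fetch_since: callable taking the watermark and returning newer rows.
    """
    watermark = max((r["modified"] for r in cached_rows), default="")
    new_rows = fetch_since(watermark)
    return cached_rows + new_rows

rows = incremental_load(
    [{"id": 1, "modified": "2024-01-01"}],
    lambda ts: [{"id": 2, "modified": "2024-02-01"}] if ts < "2024-02-01" else [],
)
# Only rows after the watermark were fetched; the cache keeps growing.
```

In a Qlik load script the same pattern is a `WHERE Modified > '$(vLastLoad)'` clause followed by a concatenating load from the QVD.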
Bengaluru (Bangalore)
3 - 7 yrs
₹8L - ₹15L / yr
SQL Server Analysis Services (SSAS)
SSAS
Azure Analysis
Azure Analysis Service
SQL
+1 more
1) Experience in developing, optimizing, and administering Tabular Models in Azure Analysis Services or SSAS.
2) Expertise in developing OLAP cubes, complex calculations, and aggregations, and implementing a dynamic security model using MDX/DAX functions in Azure Analysis Services or SSAS.
3) Extensive use of Performance Monitor, SQL Profiler, and DMVs to resolve deadlocks, monitor long-running queries, and troubleshoot cubes, SQL, and T-SQL.
Roles & Responsibilities:
1) SSAS or Azure Analysis Services lead developer with 7+ years of experience in SSAS Azure data model development, SSAS data model deployment in Azure, and querying data from SSAS Azure to build reports.
2) Design and create SSAS/OLAP/OLTP/Tabular cubes and automate processes for analytical needs.
3) Write optimized SQL queries for integration with other applications; maintain data quality and oversee database security, partitions, and indexes.
Bengaluru (Bangalore), Mumbai, Gurugram
1 - 4 yrs
₹5L - ₹8L / yr
Python
Automation
SaaS
Javascript
API
+1 more
The Opportunity
We are looking for an automation specialist who will play a key role in Sattva's digitisation
initiatives. Our rapid growth in the last year has underscored the importance of
technology-driven solutions to manage business processes at scale.
Currently our tech landscape is a collection of best-of-breed SaaS solutions that need to be
integrated/extended based on business needs. This role involves identifying automation
opportunities and realising them through low/no-code platforms like AppSheet, Zapier, etc. It is a technical role that also involves interfacing with people across different Business Units within Sattva. It offers the opportunity to work with best-in-class SaaS solutions like Google Workspace, FreshTeams, ClickUp, and QuickBooks.
Responsibilities
● Analyse the existing landscape of SaaS solutions to identify automation gaps in key business processes
● Integrate best-of-breed SaaS solutions using APIs and Low/No-Code tools
● Build apps to extend existing SaaS solutions like FreshTeams, QuickBooks, ClickUp, etc
using available APIs and SDKs
● Configure SaaS solutions to meet the needs of a specific Business Unit or of a defined
security policy
● Build Slack apps to integrate with SaaS solutions in the landscape
● Troubleshoot technical issues with the configured solutions in the landscape
Ideal Candidate Profile
● 1+ years of experience in integrating/extending SaaS solutions
● Solid expertise in developing automation scripts and applications using Javascript or
Python
● Strong problem-solving ability
● Excellent communication skills
● Proven ability to interface with multiple stakeholders across business verticals
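Much of the integration work described above is reshaping a record from one SaaS API into another's payload. A hedged sketch: turning a hypothetical HR-system record into a Slack incoming-webhook payload (the field names are illustrative, not any product's real schema):

```python
import json

def new_hire_to_slack(record):
    """Map a (hypothetical) HR-system record to a Slack webhook payload."""
    text = f":tada: {record['name']} joins {record['team']} on {record['start_date']}"
    return json.dumps({"text": text})

payload = new_hire_to_slack(
    {"name": "Asha", "team": "Data", "start_date": "2024-07-01"}
)
# POSTing `payload` (Content-Type: application/json) to a Slack
# incoming-webhook URL posts the message to the configured channel.
```

Low/no-code tools like Zapier encapsulate exactly this mapping step; writing it by hand is what the role calls for when the off-the-shelf connector falls short.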
Pune
5 - 9 yrs
₹8L - ₹14L / yr
Java
JDBC
Webservices
Multithreading
Core Java
Member Technical Staff - Data Source Adapters
About Tibco
Headquartered in Palo Alto, CA, TIBCO Software enables businesses to reach new heights on their path to digital distinction and innovation. From systems to devices and people, we interconnect everything, capture data in real time wherever it is, and augment the intelligence of organizations through analytical insights. Thousands of customers around the globe rely on us to build compelling experiences, energize operations, and propel innovation. Our teams flourish on new ideas and welcome individuals who thrive in transforming challenges
into opportunities. From designing and building amazing products to providing excellent service, we encourage and are shaped by bold thinkers, problem-solvers, and self-starters. We are always adapting and providing exciting opportunities for our employees to grow, learn and excel.
We value the customers and employees that define who we are; dynamic individuals willing to take the risks necessary to make big ideas come to life and who are comfortable collaborating in our creative, optimistic environment. TIBCO – we are just scratching the surface.
Who You’ll Work With
TIBCO Data Virtualization (TDV) is an enterprise data virtualization solution that orchestrates access to multiple and varied data sources, delivering data sets and IT curated data services to any analytics solution. TDV is a Java based enterprise-grade database engine supporting all phases of data virtualization development, run-time, and management. It is the trusted solution of choice for the top enterprises in verticals like finance, energy, pharmaceutical, retail, telecom
etc. Are you interested in working on leading-edge technologies? Are you fascinated by Big Data, Cloud, Federation, and Data Pipelines? If you have built software frameworks and have a background in Data Technologies, Application Servers, Business Intelligence etc., this opportunity is for you.
Overview
The TIBCO Data Virtualization team is looking for an engineer with experience in SQL data access using JDBC, web services, and native client access for both relational and non-relational sources. You will have expertise in developing a metadata layer around disparate data sources and implementing a query runtime engine for data access, including plugin management. The core responsibilities will include designing, implementing, and maintaining the
subsystem that abstracts data and metadata access across different relational database flavors, BigData sources, Cloud applications, enterprise application packages like SAP R/3, SAP BW, Salesforce etc. The server is implemented by a multi-million line source base in Java, so the ability to understand and integrate with existing code is an absolute must. The core runtime is a complex multi-threaded system and the successful candidate will demonstrate complete expertise in handling features geared towards concurrent transactions in a low latency, high throughput and scalable server environment. The candidate will have the opportunity to work in a collaborative environment with leading database experts in building the most robust, scalable and high performing database server.
Job Responsibilities
• In this crucial role as a Data Source Engineer, you will:
• Drive enhancements to existing data-source layer capabilities
• Understand and interface with 3rd party JDBC drivers
• Ensure all security-related aspects of driver operation function with zero defects
• Diagnose customer issues and perform bug fixes
• Suggest and implement performance optimizations
Required Skills
• Bachelor’s degree with 3+ years of experience, or equivalent work experience.
• 3+ years programming experience
• 2+ years of Java based server side experience
• 1+ years experience with at least one of JDBC, ODBC, SOAP, REST, and OData
• 1+ years of multithreading experience
• Proficiency in both spoken and written communication in English is a must
Desired Skills
• Strong object-oriented design background
• Strong SQL & database background
• Experience developing or configuring cloud-based software
• Experience with all lifecycle aspects of enterprise software
• Experience working with large, pre-existing code bases
• Experience with enterprise security technologies
• Experience with any of the following types of data sources: Relational, Big Data, Cloud, Data
Lakes, and Enterprise Applications.
• Experience using Hive, Hadoop, Impala, Cloudera, and other Big Data technologies
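The data-source abstraction described above is essentially an adapter pattern: one interface per source family, dialect-specific behaviour behind it, and a registry keyed by source type, so the query engine stays source-agnostic. An illustrative sketch in Python (the actual TDV codebase is Java, and these class and method names are hypothetical):

```python
class SourceAdapter:
    """Base adapter: ANSI-style double-quoted identifiers."""
    def quote_identifier(self, name):
        return f'"{name}"'

class MySQLAdapter(SourceAdapter):
    """MySQL dialect quotes identifiers with backticks."""
    def quote_identifier(self, name):
        return f"`{name}`"

# Registry keyed by source type; the engine looks adapters up at runtime
# instead of hardcoding any one dialect.
ADAPTERS = {"postgres": SourceAdapter(), "mysql": MySQLAdapter()}

def quoted(source_type, column):
    return ADAPTERS[source_type].quote_identifier(column)
```

A real adapter layer would cover type mapping, capability flags (what SQL the source can push down), and connection handling in the same per-source classes.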