SQL Query Analyzer Jobs in Chennai

11+ SQL Query Analyzer Jobs in Chennai | SQL Query Analyzer Job openings in Chennai

Apply to 11+ SQL Query Analyzer Jobs in Chennai on CutShort.io. Explore the latest SQL Query Analyzer Job opportunities across top companies like Google, Amazon & Adobe.

Agiletech Info Solutions pvt ltd
Chennai
4 - 7 yrs
₹7L - ₹16L / yr
SQL Server
SSIS
ETL
ETL QA
ADF
+3 more
  • Proficient with SQL Server/T-SQL programming in creation and optimization of complex Stored Procedures, UDF, CTE and Triggers
  • Overall experience between 4 and 7 years
  • Experience working in a data warehouse environment and a strong understanding of dimensional data modeling concepts. Experience in SQL server, DW principles and SSIS.
  • Should have strong experience in building data transformations with SSIS including importing data from files, and moving data from source to destination.
  • Creating new SSIS packages or modifying existing SSIS packages using SQL server
  • Debug and fine-tune SSIS processes to ensure accurate and efficient movement of data. Experience with ETL testing & data validation.
  • 1+ years of experience with Azure services like Azure Data Factory, Data flow, Azure blob Storage, etc.
  • 1+ years of experience with developing Azure Data Factory Objects - ADF pipeline, configuration, parameters, variables, Integration services runtime.
  • Must be able to build Business Intelligence solutions in a collaborative, agile development environment.
  • Reporting experience with Power BI or SSRS is a plus.
  • Experience working on an Agile/Scrum team preferred.
  • Proven strong problem-solving skills, troubleshooting, and root cause analysis.
  • Excellent written and verbal communication skills.
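As a quick illustration of the CTE-style logic the first bullet asks for, here is a minimal, hedged sketch. SQLite (via Python's standard library) stands in for SQL Server, and the `orders` table and its columns are invented for illustration, not taken from any real schema.

```python
import sqlite3

# In-memory database stands in for a SQL Server instance; the "orders"
# table and its columns are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 120.0), (2, "acme", 80.0), (3, "globex", 50.0)],
)

# A CTE aggregates per-customer totals, then the outer query filters on
# them, mirroring logic that would usually live in a stored procedure.
rows = conn.execute(
    """
    WITH totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total FROM totals WHERE total > 100
    """
).fetchall()
print(rows)  # [('acme', 200.0)]
```

The same shape (CTE feeding a filtered outer query) carries over to T-SQL; SQL Server adds features such as triggers and table-valued UDFs that SQLite does not model.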
Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
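The Lambda and event-driven bullets above can be sketched as a minimal handler. The event shape follows the standard S3 notification format, and `run_etl` is a hypothetical placeholder for the real ETL job (e.g. a PySpark submission or Step Functions start), not part of the original posting.

```python
# Minimal sketch of an event-driven Lambda entry point: the handler pulls
# the bucket/key out of an S3 put event and hands them to an ETL job.
def run_etl(bucket: str, key: str) -> str:
    # Placeholder for the real transformation/load step.
    return f"processed s3://{bucket}/{key}"


def handler(event: dict, context=None) -> list:
    results = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        results.append(run_etl(s3["bucket"]["name"], s3["object"]["key"]))
    return results


# Local smoke test with a fake S3 notification event.
fake_event = {"Records": [{"s3": {"bucket": {"name": "raw-zone"},
                                  "object": {"key": "2024/01/orders.csv"}}}]}
print(handler(fake_event))  # ['processed s3://raw-zone/2024/01/orders.csv']
```

Keeping the handler thin like this makes it easy to unit-test locally and to chain from Step Functions, which simply invokes it with the same event payload.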
Read more
Optisol Business Solutions Pvt Ltd
Posted by Veeralakshmi K
Remote, Chennai, Coimbatore, Madurai
4 - 10 yrs
₹10L - ₹15L / yr
Python
SQL
Amazon Redshift
Amazon RDS
AWS Simple Notification Service (SNS)
+5 more

Role Summary


As a Data Engineer, you will be an integral part of our Data Engineering team, supporting an event-driven serverless data engineering pipeline on AWS cloud and responsible for assisting in the end-to-end analysis, development & maintenance of data pipelines and systems (DataOps). You will work closely with fellow data engineers & production support to ensure the availability and reliability of data for analytics and business intelligence purposes.


Requirements:


  • Around 4 years of working experience in data warehousing / BI systems.
  • Strong hands-on experience with Snowflake AND strong programming skills in Python.
  • Strong hands-on SQL skills.
  • Knowledge of cloud databases such as Snowflake, Redshift, Google BigQuery, RDS, etc.
  • Knowledge of dbt for cloud databases.
  • AWS services such as SNS, SQS, ECS, Docker, Kinesis & Lambda functions.
  • Solid understanding of ETL processes and data warehousing concepts.
  • Familiarity with version control systems (e.g., Git/Bitbucket, etc.) and collaborative development practices in an agile framework.
  • Experience with Scrum methodologies.
  • Infrastructure build tools such as CFT / Terraform are a plus.
  • Knowledge of Denodo, data cataloguing tools & data quality mechanisms is a plus.
  • Strong team player with good communication skills.
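As a small, hedged illustration of the batch-loading logic the warehouse bullets imply: bulk loaders (Snowflake or Redshift COPY, `executemany`) work best on bounded batches rather than one row at a time. This stdlib-only generator slices any row stream into fixed-size chunks; the batch size of 3 is arbitrary and chosen for the demo.

```python
from itertools import islice
from typing import Iterable, Iterator, List


def batches(rows: Iterable, size: int) -> Iterator[List]:
    """Yield successive fixed-size chunks from any iterable of rows."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk


# Demo: seven "rows" split into batches of three for a bulk insert.
chunks = list(batches(range(7), 3))
print(chunks)  # [[0, 1, 2], [3, 4, 5], [6]]
```

In a real pipeline each chunk would be passed to the warehouse driver's bulk-insert call instead of being printed.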


Overview Optisol Business Solutions


OptiSol was named on this year's Best Companies to Work For list by Great Place to Work. We are a team of about 500+ Agile employees with a development center in India and global offices in the US, UK, Australia, Ireland, Sweden, and Dubai. Over a joyful journey of 16+ years, we have built about 500+ digital solutions. We have 200+ happy and satisfied clients across 24 countries.


Benefits, working with Optisol


  • Great learning & development program
  • Flextime, work-at-home & hybrid options
  • A knowledgeable, high-achieving, experienced & fun team
  • Spot awards & recognition
  • The chance to be a part of the next success story
  • A competitive base salary


More than just a job, we offer an opportunity to grow. Are you the one looking to build your future and your dream? We have the job for you, to make your dream come true.

Read more
A Product Based Client, Chennai

Agency job
via SangatHR by Anna Poorni
Chennai
4 - 8 yrs
₹10L - ₹15L / yr
Data Warehouse (DWH)
Informatica
ETL
Spark
PySpark
+2 more

Analytics Job Description

We are hiring an Analytics Engineer to help drive our Business Intelligence efforts. You will partner closely with leaders across the organization, working together to understand the how and why of people, team and company challenges, workflows and culture. The team is responsible for delivering data and insights that drive decision-making, execution, and investments for our product initiatives.

You will work cross-functionally with product, marketing, sales, engineering, finance, and our customer-facing teams, enabling them with data and narratives about the customer journey. You'll also work closely with other data teams, such as data engineering and product analytics, to ensure we are creating a strong data culture at Blend that enables our cross-functional partners to be more data-informed.


Role: Data Engineer

Please find below the JD for the Data Engineer role.

Location: Guindy, Chennai

How you’ll contribute:

• Develop objectives and metrics, ensure priorities are data-driven, and balance short-term and long-term goals
• Develop deep analytical insights to inform and influence product roadmaps and business decisions and help improve the consumer experience
• Work closely with GTM and supporting operations teams to author and develop core data sets that empower analyses
• Deeply understand the business and proactively spot risks and opportunities
• Develop dashboards and define metrics that drive key business decisions
• Build and maintain scalable ETL pipelines via solutions such as Fivetran, Hightouch, and Workato
• Design our Analytics and Business Intelligence architecture, assessing and implementing new technologies that fit
• Work with our engineering teams to continually make our data pipelines and tooling more resilient


Who you are:

• Bachelor’s degree or equivalent required from an accredited institution with a quantitative focus such as Economics, Operations Research, Statistics, or Computer Science, OR 1-3 years of experience as a Data Analyst, Data Engineer, or Data Scientist
• Must have strong SQL and data modeling skills, with experience applying these skills to thoughtfully create data models in a warehouse environment
• A proven track record of using analysis to drive key decisions and influence change
• Strong storyteller with the ability to communicate effectively with managers and executives
• Demonstrated ability to define metrics for product areas, understand the right questions to ask, push back on stakeholders in the face of ambiguous, complex problems, and work with diverse teams with different goals
• A passion for documentation
• A solution-oriented growth mindset; you’ll need to be a self-starter and thrive in a dynamic environment
• A bias towards communication and collaboration with business and technical stakeholders
• Quantitative rigor and systems thinking
• Prior startup experience is preferred, but not required
• Interest or experience in machine learning techniques (such as clustering, decision trees, and segmentation)
• Familiarity with a scientific computing language, such as Python, for data wrangling and statistical analysis
• Experience with a SQL-focused data transformation framework such as dbt
• Experience with a Business Intelligence tool such as Mode or Tableau


Mandatory Skillset:


- Very strong in SQL
- Spark OR PySpark OR Python
- Shell scripting


Read more
TekClan
Posted by Tanu Shree
Chennai
4 - 7 yrs
Best in industry
MS SQL Server
SQL Programming
SQL
ETL
ETL management
+5 more

Company - Tekclan Software Solutions

Position – SQL Developer

Experience – Minimum 4+ years of experience in MS SQL server, SQL Programming, ETL development.

Location - Chennai


We are seeking a highly skilled SQL Developer with expertise in MS SQL Server, SSRS, SQL programming, writing stored procedures, and proficiency in ETL using SSIS. The ideal candidate will have a strong understanding of database concepts, query optimization, and data modeling.


Responsibilities:

1. Develop, optimize, and maintain SQL queries, stored procedures, and functions for efficient data retrieval and manipulation.

2. Design and implement ETL processes using SSIS for data extraction, transformation, and loading from various sources.

3. Collaborate with cross-functional teams to gather business requirements and translate them into technical specifications.

4. Create and maintain data models, ensuring data integrity, normalization, and performance.

5. Generate insightful reports and dashboards using SSRS to facilitate data-driven decision making.

6. Troubleshoot and resolve database performance issues, bottlenecks, and data inconsistencies.

7. Conduct thorough testing and debugging of SQL code to ensure accuracy and reliability.

8. Stay up-to-date with emerging trends and advancements in SQL technologies and provide recommendations for improvement.

9. Should be an independent individual contributor.


Requirements:

1. Minimum of 4+ years of experience in MS SQL server, SQL Programming, ETL development.

2. Proven experience as a SQL Developer with a strong focus on MS SQL Server.

3. Proficiency in SQL programming, including writing complex queries, stored procedures, and functions.

4. In-depth knowledge of ETL processes and hands-on experience with SSIS.

5. Strong expertise in creating reports and dashboards using SSRS.

6. Familiarity with database design principles, query optimization, and data modeling.

7. Experience with performance tuning and troubleshooting SQL-related issues.

8. Excellent problem-solving skills and attention to detail.

9. Strong communication and collaboration abilities.

10. Ability to work independently and handle multiple tasks simultaneously.


Preferred Skills:

1. Certification in MS SQL Server or related technologies.

2. Knowledge of other database systems such as Oracle or MySQL.

3. Familiarity with data warehousing concepts and tools.

4. Experience with version control systems.

Read more
InfoCepts
Posted by Lalsaheb Bepari
Chennai, Pune, Nagpur
7 - 10 yrs
₹5L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more

Responsibilities:

 

• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured & semi-structured data processing

• Implementing Spark processing based ETL frameworks

• Implementing Big data pipeline for Data Ingestion, Storage, Processing & Consumption

• Modifying the Informatica-Teradata & Unix based data pipeline

• Enhancing the Talend-Hive/Spark & Unix based data pipelines

• Develop and Deploy Scala/Python based Spark Jobs for ETL processing

• Strong SQL & DWH concepts.

 

Preferred Background:

 

• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs

• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives

• Understanding of the business's EDW systems and creating high-level design and low-level implementation documents

• Understanding of the business's Big Data Lake systems and creating high-level design and low-level implementation documents

• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption

Read more
Ganit Business Solutions


3 recruiters
Posted by Viswanath Subramanian
Remote, Chennai, Bengaluru (Bangalore), Mumbai
3 - 7 yrs
₹12L - ₹25L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
R Programming
+5 more

Ganit has flipped the data science value chain: we do not start with a technique; for us, consumption comes first. With this philosophy, we have successfully scaled from being a small start-up to a 200-person company with clients in the US, Singapore, Africa, UAE, and India.

We are looking for experienced data enthusiasts who can make the data talk to them. 

 

You will: 

  • Understand business problems and translate business requirements into technical requirements. 
  • Conduct complex data analysis to ensure data quality & reliability i.e., make the data talk by extracting, preparing, and transforming it. 
  • Identify, develop and implement statistical techniques and algorithms to address business challenges and add value to the organization. 
  • Gather requirements and communicate findings in the form of a meaningful story with the stakeholders  
  • Build & implement data models using predictive modelling techniques. Interact with clients and provide support for queries and delivery adoption. 
  • Lead and mentor data analysts. 

 

We are looking for someone who has: 

 

  • Apart from your love for data and your ability to code even while sleeping, you would need the following.
  • Minimum of 2 years of experience in designing and delivery of data science solutions.
  • You should have successful retail/BFSI/FMCG/Manufacturing/QSR projects to show off.
  • Deep understanding of various statistical techniques, mathematical models, and algorithms to start the conversation with the data in hand.
  • Ability to choose the right model for the data and translate that into code using R, Python, VBA, SQL, etc.
  • Bachelor's/Master's degree in Engineering/Technology, or MBA from a Tier-1 B-School, or M.Sc. in Statistics or Mathematics.

Skillset Required:

  • Regression
  • Classification
  • Predictive Modelling
  • Prescriptive Modelling
  • Python
  • R
  • Descriptive Modelling
  • Time Series
  • Clustering
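As a minimal illustration of the regression item in the skillset above: ordinary least squares for a single feature has a closed form. This stdlib-only sketch fits y = a*x + b; the toy data is invented for the demo.

```python
# Closed-form simple linear regression (ordinary least squares):
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx


xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # exactly y = 2x, so the fit is exact
a, b = fit_line(xs, ys)
print(a, b)  # 2.0 0.0
```

Classification, clustering, and time-series items build on the same estimation ideas, though in practice libraries such as scikit-learn or statsmodels would be used rather than hand-rolled formulas.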

What is in it for you: 

 

  • Be a part of building the biggest brand in Data science. 
  • An opportunity to be a part of a young and energetic team with a strong pedigree. 
  • Work on awesome projects across industries and learn from the best in the industry, while growing at a hyper rate. 

 

Please Note:  

 

At Ganit, we are looking for people who love problem solving. You are encouraged to apply even if your experience does not precisely match the job description above. Your passion and skills will stand out and set you apart—especially if your career has taken some extraordinary twists and turns over the years. We welcome diverse perspectives, people who think rigorously and are not afraid to challenge assumptions in a problem. Join us and punch above your weight! 

Ganit is an equal opportunity employer and is committed to providing a work environment that is free from harassment and discrimination. 

All recruitment, selection procedures and decisions will reflect Ganit’s commitment to providing equal opportunity. All potential candidates will be assessed according to their skills, knowledge, qualifications, and capabilities. No regard will be given to factors such as age, gender, marital status, race, religion, physical impairment, or political opinions. 

Read more
Amagi Media Labs


3 recruiters
Posted by Rajesh C
Chennai
15 - 18 yrs
Best in industry
Data architecture
Architecture
Data Architect
Architect
Java
+5 more
Job Title: Data Architect
Job Location: Chennai
Job Summary

The Engineering team is seeking a Data Architect. As a Data Architect, you will drive a Data Architecture strategy across various Data Lake platforms. You will help develop reference architecture and roadmaps to build highly available, scalable and distributed data platforms using cloud-based solutions to process high-volume, high-velocity and wide varieties of structured and unstructured data. This role is also responsible for driving innovation, prototyping, and recommending solutions. Above all, you will influence how users interact with Condé Nast's industry-leading journalism.
Primary Responsibilities
Data Architect is responsible for
• Demonstrated technology and personal leadership experience in architecting, designing, and building highly scalable solutions and products.
• Enterprise-scale expertise in data management best practices such as data integration, data security, data warehousing, metadata management and data quality.
• Extensive knowledge and experience in architecting modern data integration frameworks and highly scalable distributed systems using open source and emerging data architecture designs/patterns.
• Experience building external cloud (e.g. GCP, AWS) data applications and capabilities is highly desirable.
• Expert ability to evaluate, prototype and recommend data solutions and vendor technologies and platforms.
• Proven experience in relational, NoSQL, ELT/ETL technologies and in-memory databases.
• Experience with DevOps, Continuous Integration and Continuous Delivery technologies is desirable.
• This role requires 15+ years of data solution architecture, design and development delivery experience.
• Solid experience in Agile methodologies (Kanban and Scrum).
Required Skills
• Very strong experience in building large-scale, high-performance data platforms.
• Passionate about technology and delivering solutions for difficult and intricate problems. Current on relational databases and NoSQL databases on the cloud.
• Proven leadership skills; demonstrated ability to mentor, influence and partner with cross-functional teams to deliver scalable, robust solutions.
• Mastery of relational database, NoSQL, ETL (such as Informatica, DataStage, etc.)/ELT and data integration technologies.
• Experience in at least one object-oriented programming language (Java, Scala, Python) and Spark.
• Creative view of markets and technologies combined with a passion to create the future.
• Knowledge of cloud-based distributed/hybrid data-warehousing solutions; Data Lake knowledge is mandatory.
• Good understanding of emerging technologies and their applications.
• Understanding of code versioning tools such as GitHub, SVN, CVS, etc.
• Understanding of Hadoop architecture and Hive SQL.
• Knowledge of at least one workflow orchestration tool.
• Understanding of the Agile framework and delivery.

Preferred Skills:
● Experience in AWS and EMR would be a plus
● Exposure to workflow orchestration tools like Airflow is a plus
● Exposure to any one NoSQL database would be a plus
● Experience in Databricks along with PySpark/Spark SQL would be a plus
● Experience with the Digital Media and Publishing domain would be a plus
● Understanding of digital web events, ad streams, and context models
About Condé Nast
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and social platforms - in other words, a staggering amount of user data. Condé Nast made the right move to invest heavily in understanding this data and formed a whole new Data team entirely dedicated to data processing, engineering, analytics, and visualization. This team helps drive engagement, fuel process innovation, further content enrichment, and increase market revenue. The Data team aimed to create a company culture where data was the common language and to facilitate an environment where insights shared in real time could improve performance.

The Global Data team operates out of Los Angeles, New York, Chennai, and London. The team at Condé Nast Chennai works extensively with data to amplify its brands' digital capabilities and boost online revenue. We are broadly divided into four groups: Data Intelligence, Data Engineering, Data Science, and Operations (including Product and Marketing Ops, Client Services), along with Data Strategy and monetization. The teams build capabilities and products to create data-driven solutions for better audience engagement.

What we look forward to:
We want to welcome bright, new minds into our midst and work together to create diverse forms of self-expression. At Condé Nast, we encourage the imaginative and celebrate the extraordinary. We are a media company for the future, with a remarkable past. We are Condé Nast, and It Starts Here.
Read more
vThink Global Technologies
Posted by Balasubramanian Ramaiyar
Chennai
4 - 7 yrs
₹8L - ₹15L / yr
SQL
ETL
Informatica
Data Warehouse (DWH)
Stored Procedures
+1 more
We are looking for a strong SQL Developer, well-versed and hands-on in SQL, stored procedures, joins, and ETL: a data-savvy individual with advanced SQL skills.
Read more
Telecom Client

Agency job
via Eurka IT SOL by Srikanth a
Chennai
5 - 13 yrs
₹9L - ₹28L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more
  • Demonstrable experience owning and developing big data solutions using Hadoop, Hive/HBase, Spark, Databricks, and ETL/ELT for 5+ years
  • 10+ years of Information Technology experience, preferably with telecom / wireless service providers
  • Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability and releasability; rapid prototyping, data modeling, and decentralized innovation
  • DataOps mindset: allowing the architecture of a system to evolve continuously over time, while simultaneously supporting the needs of current users
  • Create and maintain Architectural Runway and Non-Functional Requirements
  • Design for the Continuous Delivery Pipeline (CI/CD data pipeline), enabling built-in quality & security from the start
  • Ability to demonstrate an understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF, Zachman Architecture Framework, etc.
  • The ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers
  • Demonstrated ability to work collaboratively
  • Excellent written, verbal and social skills - you will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.)
  • Ability to work in a fast-paced, multiple-project environment on an independent basis and with minimal supervision
  • Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems

Read more
LatentView Analytics


3 recruiters
Posted by Kannikanti madhuri
Chennai
1 - 4 yrs
₹2L - ₹10L / yr
Business Intelligence (BI)
Analytics
SQL server
Data Visualization
Tableau
+7 more
Title: Analyst - Business Analytics
Experience: 1 - 4 Years
Location: Chennai
Open Positions: 17

Job Description:

Roles & Responsibilities:
- Designing and implementing analytical projects that drive business goals and decisions, leveraging structured and unstructured data.
- Generating a compelling story from insights and trends in a complex data environment.
- Working shoulder-to-shoulder with business partners to come up with creative approaches to solve the business problem.
- Creating dashboards for business heads by exploring available data assets.

Qualifications:
- Overall 1+ years of Business Analytics experience with strong communication skills.
- Bachelor's or Master's degree in computer science is preferred.
- Excellent problem solving and client orientation skills.

Skills Required:
- Ability to program in advanced SQL is a must.
- Hands-on experience in modeling tools such as R or Python.
- Experience in visualization tools such as Power BI, Tableau, Looker, etc., would be a big plus.
- Analytics certifications from recognized platforms would be a plus - Udemy, Coursera, EDX, etc.
Read more