11+ SPSS Jobs in Hyderabad | SPSS Job openings in Hyderabad
Apply to 11+ SPSS Jobs in Hyderabad on CutShort.io. Explore the latest SPSS Job opportunities across top companies like Google, Amazon & Adobe.
Interpret data, analyze results using statistical techniques and provide ongoing reports
Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
Acquire data from primary or secondary data sources and maintain databases/data systems
Identify, analyze, and interpret trends or patterns in complex data sets
Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
Work with management to prioritize business and information needs
Locate and define new process improvement opportunities
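The cleaning and trend-identification duties above can be sketched in plain Python. This is an illustrative toy, not tooling from the job description; the field names (`month`, `value`) and the `"N/A"` sentinel are assumptions.

```python
from statistics import mean

def clean_records(records):
    """Drop rows with missing/unusable values and coerce numeric fields."""
    cleaned = []
    for row in records:
        if row.get("value") in (None, "", "N/A"):
            continue  # filter out rows that cannot be analyzed
        cleaned.append({**row, "value": float(row["value"])})
    return cleaned

def monthly_trend(records):
    """Average the cleaned values per month to surface a simple trend."""
    by_month = {}
    for row in records:
        by_month.setdefault(row["month"], []).append(row["value"])
    return {m: mean(vs) for m, vs in sorted(by_month.items())}

raw = [
    {"month": "2024-01", "value": "10"},
    {"month": "2024-01", "value": "N/A"},  # discarded during cleaning
    {"month": "2024-02", "value": "14"},
]
print(monthly_trend(clean_records(raw)))  # {'2024-01': 10.0, '2024-02': 14.0}
```

In practice the same filter-then-aggregate shape is what a SQL query or a pandas pipeline would express over a real database.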
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark.
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks.
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
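The "AWS Lambda for event-based tasks" responsibility usually means a handler reacting to service events such as S3 object uploads. A minimal sketch, assuming an S3 notification trigger; the bucket name and downstream step are hypothetical, not from this posting.

```python
import json
from urllib.parse import unquote_plus

def lambda_handler(event, context):
    """Event-based job sketch: react to S3 object-created notifications.

    A real ETL job would launch a Glue job or a Step Functions execution
    per object instead of just collecting the keys."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])  # keys arrive URL-encoded
        processed.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps(processed)}

# Simulated event, shaped like the S3 notification AWS delivers to Lambda
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-zone"},
                "object": {"key": "orders/2024/01/data.csv"}}}
    ]
}
print(lambda_handler(sample_event, None))
```

Step Functions would then chain such Lambdas (extract, transform, load) into one automated state machine, which is what the automation bullet above refers to.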
at Nagarro Software
👋🏼We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
- Bachelor's/Master's degree or equivalent experience in Computer Science
- 10-12 years of overall experience, with at least 4 years on the Jitterbit Harmony platform and Jitterbit Cloud
- Experience technically leading and mentoring developers who may be geographically distributed
- Knowledge of Change & Incident Management process (JIRA etc.)
RESPONSIBILITIES:
- Responsible for end-to-end implementation of integration use cases using the Jitterbit platform.
- Coordinate with all stakeholders for successful project execution.
- Responsible for requirement gathering, integration strategy, design, implementation, etc.
- Should have strong hands-on experience designing, building, and deploying integration solutions using the Jitterbit Harmony platform.
- Should have developed enterprise services using REST-based APIs, SOAP web services, and the various Jitterbit connectors (Salesforce, DB, JMS, File, HTTP/HTTPS, any TMS connector).
- Should have knowledge of Custom Jitterbit Plugins and Custom Connectors.
- Experience in Jitterbit implementations including security, logging, error handling, scalability and clustering.
- Strong experience in Jitterbit Script, XSLT and JavaScript.
- Install, configure and deploy solution using Jitterbit.
- Provide test support for bug fixes during all stages of test cycle.
- Provide support for deployment and post go-live.
- Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, and testing.
- Understand the requirements, create necessary documentation, give presentations to clients, get necessary approvals, and create the design doc for the release.
- Estimate tasks and discuss risks/issues with the clients.
- Work on assigned modules independently and test the application; conduct code reviews and suggest best practices to the team.
- Broad knowledge of web standards relating to APIs (OAuth, SSL, CORS, JWT, etc.)
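On the JWT point above: a token is three base64url segments, `header.payload.signature`. A toy decoder, for illustration only; real services must verify the signature with a proper library (e.g. PyJWT), and the claim values here are made up.

```python
import base64
import json

def decode_jwt_payload(token):
    """Decode (NOT verify) the payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(padded))

# Build a token by hand so the example is self-contained
header = base64.urlsafe_b64encode(json.dumps({"alg": "HS256"}).encode()).rstrip(b"=").decode()
payload = base64.urlsafe_b64encode(json.dumps({"sub": "user-42"}).encode()).rstrip(b"=").decode()
token = f"{header}.{payload}.fake-signature"
print(decode_jwt_payload(token))  # {'sub': 'user-42'}
```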
at Softobiz Technologies Private limited
Responsibilities
- Design and implement Azure BI infrastructure, ensure overall quality of delivered solution
- Develop analytical & reporting tools, promote and drive adoption of developed BI solutions
- Actively participate in BI community
- Establish and enforce technical standards and documentation
- Participate in daily scrums
- Record progress daily in assigned DevOps items
Ideal Candidates should have
- 5+ years of experience in a similar senior business intelligence development position
- To be successful in the role, you will require a high level of expertise across all facets of the Microsoft BI stack and prior experience designing and developing well-performing data warehouse solutions
- Demonstrated experience using development tools such as Azure SQL database, Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure DevOps.
- Experience with development methodologies including Agile, DevOps, and CI/CD patterns
- Strong oral and written communication skills in English
- Ability and willingness to learn quickly and continuously
- Bachelor's degree in Computer Science
at Altimetrik
- Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight.
- Experience in developing Lambda functions with AWS Lambda.
- Expertise with Spark/PySpark: candidates should be hands-on with PySpark code and able to perform transformations with Spark.
- Should be able to code in Python and Scala.
- Snowflake experience will be a plus.
at Altimetrik
Big data Developer
Exp: 3yrs to 7 yrs.
Job Location: Hyderabad
Notice: Immediate / within 30 days
1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight
2. Experience in developing Lambda functions with AWS Lambda
3. Expertise with Spark/PySpark: candidates should be hands-on with PySpark code and able to perform transformations with Spark
4. Should be able to code in Python and Scala
5. Snowflake experience will be a plus
Hadoop and Hive can be treated as good to have; a working understanding is sufficient rather than a hard requirement.
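The "transformations with Spark" expectation boils down to filter/group/aggregate chains. A pure-Python stand-in for such a chain (PySpark itself is not used here); with a real SparkSession the equivalent would be roughly `df.filter(col("amount") > 0).groupBy("country").sum("amount")`.

```python
rows = [
    {"country": "IN", "amount": 120.0},
    {"country": "IN", "amount": 80.0},
    {"country": "US", "amount": -5.0},   # bad record, filtered out
    {"country": "US", "amount": 200.0},
]

# ~ df.filter(col("amount") > 0)
valid = [r for r in rows if r["amount"] > 0]

# ~ df.groupBy("country").sum("amount")
totals = {}
for r in valid:
    totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]

print(totals)  # {'IN': 200.0, 'US': 200.0}
```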
at Fragma Data Systems
Skills- Informatica with Big Data Management
1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working on Spark/SQL
3. Develops Informatica mappings/SQL
- Desire to explore new technology and break new ground.
- Are passionate about Open Source technology, continuous learning, and innovation.
- Have the problem-solving skills, grit, and commitment to complete challenging work assignments and meet deadlines.
Qualifications
- Engineer enterprise-class, large-scale deployments, and deliver Cloud-based Serverless solutions to our customers.
- You will work in a fast-paced environment with leading microservice and cloud technologies, and continue to develop your all-around technical skills.
- Participate in code reviews and provide meaningful feedback to other team members.
- Create technical documentation.
- Develop thorough Unit Tests to ensure code quality.
Skills and Experience
- Advanced skills in troubleshooting and tuning AWS Lambda functions developed with Java and/or Python.
- Experience with event-driven architecture design patterns and practices
- Experience in database design and architecture principles and strong SQL abilities
- Experience with message brokers such as Kafka and Kinesis
- Experience with Hadoop, Hive, and Spark (either PySpark or Scala)
- Demonstrated experience owning enterprise-class applications and delivering highly available distributed, fault-tolerant, globally accessible services at scale.
- Good understanding of distributed systems.
- Candidates will be self-motivated and display initiative, ownership, and flexibility.
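The event-driven and message-broker items above share one pattern: producers publish to a topic, decoupled consumers react. A toy in-memory broker illustrating that publish/subscribe shape; it is not a Kafka or Kinesis client API, just the pattern those systems provide durably and at scale.

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory publish/subscribe broker (illustrative only)."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to be called for each message on a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every handler subscribed to the topic."""
        for handler in self.subscribers[topic]:
            handler(message)

broker = MiniBroker()
seen = []
broker.subscribe("orders", lambda msg: seen.append(msg["id"]))
broker.publish("orders", {"id": 1})
broker.publish("orders", {"id": 2})
print(seen)  # [1, 2]
```

Real brokers add what this toy omits: persistence, partitioning, consumer offsets, and fault tolerance, which is where the distributed-systems understanding above comes in.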
Preferred Qualifications
- AWS Lambda function development experience with Java and/or Python.
- Lambda triggers such as SNS, SES, or cron.
- Databricks
- Cloud development experience with AWS services, including:
- IAM
- S3
- EC2
- AWS CLI
- API Gateway
- ECR
- CloudWatch
- Glue
- Kinesis
- DynamoDB
- Java 8 or higher
- ETL data pipeline building
- Data Lake Experience
- Python
- Docker
- MongoDB or similar NoSQL DB.
- Relational Databases (e.g., MySQL, PostgreSQL, Oracle, etc.).
- Gradle and/or Maven.
- JUnit
- Git
- Scrum
- Experience with Unix and/or macOS.
- Immediate Joiners
Nice to have:
- AWS / GCP / Azure Certification.
- Cloud development experience with Google Cloud or Azure
Job Description
Want to make every line of code count? Tired of being a small cog in a big machine? Like a fast-paced environment where stuff gets DONE? Wanna grow with a fast-growing company (both career and compensation)? Like to wear different hats? Join ThinkDeeply in our mission to create and apply Enterprise-Grade AI for all types of applications.
Seeking an ML Engineer with a high aptitude for development. Will also consider coders with a high aptitude for ML. Years of experience are important, but we are also looking for interest and aptitude. As part of the early engineering team, you will have a chance to make a measurable impact on the future of ThinkDeeply while taking on a significant amount of responsibility.
Experience
10+ Years
Location
Bozeman/Hyderabad
Skills
Required Skills:
Bachelor's/Master's or PhD in Computer Science, or related industry experience
3+ years of industry experience with deep learning frameworks such as PyTorch or TensorFlow
7+ years of industry experience with scripting languages such as Python and R
7+ years in software development covering at least some research/POCs, prototyping, productizing, process improvement, and large-data processing/performance computing
Familiarity with non-neural-network methods such as Bayesian methods, SVM, AdaBoost, Random Forests, etc.
Some experience in setting up large scale training data pipelines.
Some experience in using Cloud services such as AWS, GCP, Azure
Desired Skills:
Experience in building deep learning models for Computer Vision and Natural Language Processing domains
Experience in productionizing/serving machine learning in industry setting
Understand the principles of developing cloud native applications
Responsibilities
Collect, Organize and Process data pipelines for developing ML models
Research and develop novel prototypes for customers
Train, implement and evaluate shippable machine learning models
Deploy and iterate improvements of ML Models through feedback
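The train/evaluate/iterate loop above can be shown in miniature. A toy gradient-descent fit of `y ~ w*x`, standing in for the training loop a real framework (PyTorch, TensorFlow) provides; the data and learning rate are made up for illustration.

```python
def train_linear(xs, ys, lr=0.01, epochs=500):
    """Fit y ~ w*x with plain gradient descent on mean squared error."""
    w = 0.0
    for _ in range(epochs):
        # d/dw of mean((w*x - y)^2)
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

def evaluate(w, xs, ys):
    """Mean squared error of the fitted model."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # underlying rule: y = 2x
w = train_linear(xs, ys)
print(round(w, 2))  # ≈ 2.0
```

Shipping a model then means wrapping this loop with data pipelines, held-out evaluation, and redeployment as feedback arrives, per the responsibilities above.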
Required skill
- Around 6-8.5 years of experience, with around 4+ years in the AI/machine learning space
- Extensive experience designing large-scale machine learning solutions for ML use cases, handling large-scale deployments, and establishing continuous automated improvement/retraining frameworks.
- Strong experience in Python and Java is required.
- Hands-on experience with Scikit-learn, Pandas, and NLTK
- Experience handling time-series data and associated techniques such as Prophet and LSTM
- Experience with regression, clustering, and classification algorithms
- Extensive experience building traditional machine learning models (SVM, XGBoost, decision trees) and deep neural network models (RNN, feedforward) is required.
- Experience with AutoML tools such as TPOT or similar
- Must have strong hands-on experience with deep learning frameworks like Keras, TensorFlow, or PyTorch
- Knowledge of Capsule Networks, reinforcement learning, or SageMaker is a desirable skill
- Understanding of the financial domain is a desirable skill
Responsibilities
- Design and implementation of solutions for ML Use cases
- Productionize systems and maintain them
- Lead and implement the data acquisition process for ML work
- Learn new methods and models quickly and apply them to solve use cases