QA Manager

Slava Meyerzon

QA Manager at Superstream, managing testing for an enterprise Kafka optimization platform across multiple cloud environments. Hands-on Java test automation development (60% of time) using Selenium WebDriver for UI testing, alongside functional, integration, regression, E2E, performance, load, and API testing. Working with distributed systems on AWS MSK, Confluent Cloud, and Aiven.

Java Expert • Selenium WebDriver • REST API Testing

Core Technical Competencies

Java Development

Expert-level Java programming with 10+ years of experience. TestNG, JUnit, Maven, Gradle. Building custom automation frameworks and utilities.
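
A minimal sketch of the kind of framework utility this refers to, assuming TestNG: a retry analyzer that re-runs flaky tests. The class names and retry limit are illustrative, not taken from an actual framework.

```java
// Illustrative TestNG retry analyzer, a common building block of custom frameworks.
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;
import org.testng.annotations.Test;

public class RetryAnalyzer implements IRetryAnalyzer {

    private static final int MAX_RETRIES = 2; // illustrative retry budget
    private int attempts = 0;

    @Override
    public boolean retry(ITestResult result) {
        // Re-run a failed test until the retry budget is exhausted.
        return !result.isSuccess() && attempts++ < MAX_RETRIES;
    }
}

class LoginSmokeTest {

    // Attach the analyzer to a flaky UI test so transient failures are retried.
    @Test(retryAnalyzer = RetryAnalyzer.class)
    public void loginPageLoads() {
        // ... test body would drive the automation framework here ...
    }
}
```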

🔧 Selenium Automation

Advanced Selenium WebDriver implementation. Page Object Model, Data-Driven Testing, Cross-browser testing, Grid configuration.
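
For illustration, a hedged sketch of the Page Object Model in Java/Selenium; the LoginPage/DashboardPage classes, locators, and URL are hypothetical.

```java
// Illustrative Page Object: locators live in one place so UI changes touch only this class.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class LoginPage {

    private final WebDriver driver;

    private final By username = By.id("username");
    private final By password = By.id("password");
    private final By submit   = By.cssSelector("button[type='submit']");

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    public LoginPage open() {
        driver.get("https://example.test/login"); // placeholder URL
        return this;
    }

    public DashboardPage loginAs(String user, String pass) {
        driver.findElement(username).sendKeys(user);
        driver.findElement(password).sendKeys(pass);
        driver.findElement(submit).click();
        return new DashboardPage(driver); // next page object in the flow
    }
}

class DashboardPage {
    private final WebDriver driver;

    DashboardPage(WebDriver driver) {
        this.driver = driver;
    }

    public boolean isLoaded() {
        // Minimal readiness check for the sketch.
        return driver.findElement(By.tagName("h1")).isDisplayed();
    }
}
```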

🔌 API Testing

REST API automation with Rest-Assured. JSON/XML validation, OAuth 2.0, performance testing with JMeter/BlazeMeter.
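
A brief Rest-Assured sketch of the kind of check described above; the endpoint, bearer-token handling, and response fields are placeholders, not a real API.

```java
// Illustrative Rest-Assured check: status code plus JSON body validation.
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.testng.annotations.Test;

public class ClusterApiTest {

    private static final String BASE_URI = "https://api.example.test";  // placeholder
    private static final String TOKEN = System.getenv("API_TOKEN");     // OAuth 2.0 bearer token

    @Test
    public void clusterStatusIsHealthy() {
        given()
            .baseUri(BASE_URI)
            .header("Authorization", "Bearer " + TOKEN)
        .when()
            .get("/v1/clusters/{id}", "demo-cluster")   // path parameter
        .then()
            .statusCode(200)
            .body("status", equalTo("HEALTHY"));        // JSON field assertion
    }
}
```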

⚙️ CI/CD Pipeline Management

Jenkins pipeline configuration for automated test execution. GitHub Actions integration, Docker containerization for test environments. Build automation using Maven/Gradle.

👥 Team Leadership & Process

Managing QA teams of up to 8 engineers. Implementing Agile/Scrum methodologies. KPI tracking with Jira, HP QC, TestRail. Test strategy development and risk assessment.

📊 Performance Testing

JMeter and BlazeMeter for load testing. Simulating 10K+ concurrent users. Performance bottleneck analysis. Integration with monitoring tools for real-time metrics.
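
JMeter and BlazeMeter drive the real load scenarios; purely to illustrate the idea of firing concurrent requests and sampling latency, here is a small plain-Java probe. The target URL and user count are placeholders.

```java
// Illustrative concurrent load probe (not JMeter itself): N parallel requests, average latency.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.*;

public class LoadProbe {

    public static void main(String[] args) throws Exception {
        int users = 100; // scaled down from the 10K+ figures quoted above
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://example.test/health")) // placeholder target
                .GET().build();

        ExecutorService pool = Executors.newFixedThreadPool(users);
        CompletionService<Long> results = new ExecutorCompletionService<>(pool);

        for (int i = 0; i < users; i++) {
            results.submit(() -> {
                long start = System.nanoTime();
                client.send(request, HttpResponse.BodyHandlers.discarding());
                return (System.nanoTime() - start) / 1_000_000; // latency in ms
            });
        }

        long total = 0;
        for (int i = 0; i < users; i++) {
            total += results.take().get();
        }
        pool.shutdown();
        System.out.printf("avg latency: %d ms over %d requests%n", total / users, users);
    }
}
```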

☁️ Cloud & Infrastructure

AWS, Docker, Kubernetes experience. Distributed systems testing. Cloud-native application testing. Container orchestration for test environments.

🗄️ Database & Backend Testing

SQL queries optimization, PostgreSQL, MySQL. Data validation and integrity testing. Kafka messaging system testing. Backend service integration validation.
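
As a sketch of Kafka messaging validation, a produce/consume round-trip test using the standard kafka-clients API; the broker address, topic name, and group id are placeholders.

```java
// Illustrative Kafka round-trip check: produce one record, then consume and assert it arrived.
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.testng.Assert;
import org.testng.annotations.Test;

public class KafkaRoundTripTest {

    private static final String BOOTSTRAP = "localhost:9092"; // placeholder broker
    private static final String TOPIC = "qa.smoke.topic";     // placeholder topic

    @Test
    public void messageSurvivesRoundTrip() throws Exception {
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", BOOTSTRAP);
        producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>(TOPIC, "key", "payload")).get(); // block until acked
        }

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", BOOTSTRAP);
        consumerProps.put("group.id", "qa-smoke-" + System.currentTimeMillis());
        consumerProps.put("auto.offset.reset", "earliest");
        consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of(TOPIC));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            Assert.assertFalse(records.isEmpty(), "expected at least one record on " + TOPIC);
        }
    }
}
```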

🤖 AI-Powered Testing

Prompt engineering for test generation. LLM integration for test automation. GPT/Claude models for intelligent test scenario creation and root cause analysis.
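
A hedged sketch of what LLM-assisted test generation can look like in Java; the endpoint URL, payload schema, and model name are placeholders and do not represent any specific vendor's API.

```java
// Illustrative only: turn a requirement into a test-generation prompt and POST it to an LLM endpoint.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TestCaseGenerator {

    private static final String LLM_ENDPOINT = "https://llm.example.test/v1/generate"; // placeholder

    public static String generateTestCases(String requirement) throws Exception {
        // Prompt engineering: constrain the output format so it can be parsed downstream.
        String prompt = "Generate Gherkin-style test scenarios, including negative and edge cases, "
                + "for the following requirement:\n" + requirement;

        String body = "{\"model\":\"placeholder-model\",\"prompt\":" + toJsonString(prompt) + "}";

        HttpRequest request = HttpRequest.newBuilder(URI.create(LLM_ENDPOINT))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + System.getenv("LLM_API_KEY")) // placeholder key
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body(); // raw model output; a real pipeline would parse and review it
    }

    private static String toJsonString(String s) {
        // Minimal escaping for the sketch; a real implementation would use a JSON library.
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n") + "\"";
    }
}
```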

Enterprise Projects Portfolio

Kafka Optimization Platform

Superstream

Technical Challenge:

Enterprises using Apache Kafka face escalating cloud costs (up to 40% of infrastructure spend), with teams manually managing broker configurations, idle resources, and client tuning. Superstream solves this by providing autonomous Kafka optimization: detecting misconfigurations, removing idle topics/partitions/consumer groups, and applying zero-code client optimizations to reduce costs by up to 60% while improving stability.
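
Superstream's own detection logic is not shown here; as a rough illustration of how idle consumer groups can be surfaced with the standard Kafka AdminClient (the broker address is a placeholder):

```java
// Illustrative scan: list consumer groups and report the ones with no active members.
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.ConsumerGroupDescription;
import org.apache.kafka.common.ConsumerGroupState;

public class IdleGroupScan {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Collect every consumer group id known to the cluster.
            var groupIds = admin.listConsumerGroups().all().get().stream()
                    .map(g -> g.groupId())
                    .collect(Collectors.toList());

            // Describe them and flag groups in the EMPTY state (no live members).
            Map<String, ConsumerGroupDescription> descriptions =
                    admin.describeConsumerGroups(groupIds).all().get();

            descriptions.values().stream()
                    .filter(d -> d.state() == ConsumerGroupState.EMPTY)
                    .forEach(d -> System.out.println("idle consumer group: " + d.groupId()));
        }
    }
}
```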

My Role:

  • Manage system testing domain for complex Kafka optimization platform with a team of 3 QA engineers
  • Plan and execute comprehensive tests on distributed systems (60% hands-on involvement)
  • Develop test plans for advanced technology stack including AWS MSK, Confluent Cloud, and Aiven
  • Perform functional, integration, performance, and regression testing across multiple environments
  • Identify, resolve, and document defects encountered during testing cycles
  • Collaborate closely with development and engineering teams to improve system quality and integrity
  • Implement test automation processes where applicable using Java, TestNG, and Rest-Assured
  • Train and professionally manage the testing team, providing mentorship and technical guidance
  • Generate detailed reports on testing results with actionable recommendations for improvement
  • Drive process improvements using Agile methodologies and DevOps frameworks
  • Spearheading adoption of AI-powered quality assurance using GPT/Claude-based models for test case generation, natural language test specifications, and automated root cause analysis of failures

Visual Agentic AI platform

TechSee

Technical Challenge:

Enterprise-scale Visual Agentic AI platform serving millions of users with real-time computer vision, augmented reality overlay capabilities, and intelligent video analysis. TechSee's AI agents autonomously process live video streams to provide contextual guidance, object recognition, and automated troubleshooting across industries like telecommunications, insurance, and field services. The platform must handle complex scenarios: simultaneous multi-user video sessions, real-time AR annotation rendering, machine learning model inference at scale, and seamless integration with enterprise systems while maintaining sub-second latency for critical visual assistance workflows.

My Role:

  • Lead Software Automation Engineer with full ownership of E2E testing for the Visual Agentic AI platform
  • Built complete automation infrastructure from scratch using Java, Selenium, TestNG, JavaScript, and Python
  • Managed a team of 7 QA engineers: conducted 1-on-1s, performance reviews, and skill development programs
  • Direct collaboration with product, R&D, and DevOps teams on architecture decisions and test strategy
  • Client-facing responsibilities: conducted demos, gathered requirements, presented testing metrics to stakeholders
  • Designed and implemented a CI/CD pipeline with Jenkins: automated test execution on every PR
  • Created comprehensive test coverage: functional, integration, regression, and performance
  • Cross-functional leadership: participated in sprint planning, architecture reviews, and release planning
  • Established testing best practices, documentation standards, and knowledge-sharing sessions
  • Mentored team members in test automation and conducted technical interviews for new hires
  • Established AI-driven testing practices using prompt engineering and LLM integration for automated test scenario generation, intelligent test data creation, and anomaly detection in test results

VPaaS solution

Kaltura

Technical Challenge:

Enterprise-grade Video Platform as a Service (VPaaS) powering OTT media delivery for millions of subscribers with integrated financial services and payment processing. Kaltura's platform handles complex video workflows including live streaming, on-demand content delivery, transcoding pipelines, and DRM protection while simultaneously managing subscription billing, payment gateway integrations, and regulatory compliance (PCI DSS, GDPR). The system must support peak traffic scenarios like Black Friday with 100K+ concurrent users, maintain 99.9% uptime SLA, ensure secure financial transactions, and deliver adaptive streaming across multiple devices and geographical regions with CDN optimization for global content distribution.

My Role:

  • Led a team of 8 QA engineers for large-scale VPaaS solution testing
  • Managed the complete E2E testing process for a complex video platform system
  • Developed comprehensive test plans covering unit, functional, regression, integration, and API testing
  • Executed UAT (User Acceptance Testing) with enterprise clients
  • Implemented performance testing with BlazeMeter and JMeter for high-traffic scenarios
  • Set up a robust testing system with automated tests integrated into CI/CD pipelines
  • Managed KPIs for the QA team using HP QC and Jira for test management
  • Reported on testing progress and results to stakeholders with improvement recommendations
  • Promoted collaboration between QA, development, and engineering teams
  • Applied Agile methodologies in sprint planning and test execution

Mobile financial app

Jan 2016 - Aug 2017

OnO Apps • Bnei Brak

My Role:

  • Performed comprehensive testing and provided support for a financial application on iOS and Android platforms at Leumi Card
  • Wrote test scenarios encompassing client and server components, including STD, STP, mobile, and database tests
  • Conducted thorough testing of user experience (UX) and performance across various devices, including extreme cases
  • Utilized testing and management tools including Xcode, Android Studio, HP QC, MTM, and Jira throughout the testing lifecycle
  • Operated within an MF environment to enhance technical proficiency in complex systems
  • Collaborated directly with project managers, feature writers, and the marketing department in cross-functional teams
  • Ensured alignment between technical aspects and business objectives through active contribution
  • Developed a deep understanding of the financial application and its interconnected supporting systems
  • Provided valuable insights and drove testing efforts effectively based on comprehensive system knowledge
  • Delivered thorough testing and support contributing to the success of Leumi Card's mobile application

Delivery company management app

Jan 2015 - Jan 2016

Buzzr • Rosh HaAyin

My Role:

  • Worked as a dedicated back-end QA Engineer with a meticulous approach to ensuring software reliability and performance
  • Applied computer science background and hands-on experience in agile environments
  • Developed comprehensive test strategies and identified elusive defects in backend systems
  • Utilized automated testing tools, SQL, and scripting to optimize testing processes
  • Enhanced product quality through technical expertise and testing optimization
  • Collaborated with cross-functional teams to drive continuous improvement in backend systems
  • Contributed to innovative projects leveraging QA engineering skills
  • Continuously learned new technologies and methodologies to stay current in QA engineering

Technical Arsenal


Test Management Tools

Jira, HP QC, MTM, TestRail

Testing Methodologies

Functional, Integration, Performance, Regression, UAT, E2E, API Testing, AI-Driven

Performance Testing

JMeter, BlazeMeter, Load Testing

Development Methodologies

Agile, Scrum

CI/CD & Automation

Jenkins, GitHub Actions, Test Automation Frameworks

Programming Languages

Java, JavaScript, Groovy, Go, Python

Testing Frameworks

Selenium, TestNG, JUnit, Rest-Assured, Playwright, Pytest, Cypress

Cloud & Infrastructure

AWS, Docker, Kubernetes, GCP

Databases & Messaging

Apache Kafka, Microservices, PostgreSQL, Microsoft SQL Server, SQL, RabbitMQ, Redis, NATS

Availability

Available immediately

Languages

Hebrew, English, Russian