Requirements

Tooling

  1. pytest + Python Kubernetes Client
  2. Kubernetes cluster
  3. Strimzi Kafka
  4. Zookeeper
  5. Kafka REST Proxy
  6. Kafka Connect
  7. Schema Registry
  8. (Optional) Redpanda, Streams Explorer

Test scenarios

We can reuse the already defined pipeline YAMLs to test the generate command.
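One low-effort way to verify the generate command is to compare its output against a checked-in snapshot of the expected enriched pipeline. The sketch below only covers the comparison helper; how the output is captured (e.g. running the CLI via `subprocess`) is left to the test harness, and the helper name is ours:

```python
import difflib


def assert_matches_snapshot(generated: str, snapshot: str) -> None:
    """Fail with a readable unified diff when the generated pipeline
    deviates from the checked-in snapshot."""
    if generated != snapshot:
        diff = "\n".join(
            difflib.unified_diff(
                snapshot.splitlines(),
                generated.splitlines(),
                fromfile="snapshot",
                tofile="generated",
                lineterm="",
            )
        )
        raise AssertionError(f"generated pipeline differs from snapshot:\n{diff}")
```

A plain string comparison would also fail the test, but the diff makes it much easier to see which component or setting changed when a snapshot goes stale.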

Dry-run

We can trigger a dry run before executing the deployment. This not only tests the dry-run functionality itself but also verifies that the infrastructure is up and running.
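Because the dry run doubles as an infrastructure smoke test, the test setup needs a way to wait until the cluster components (broker, REST Proxy, Connect, etc.) are reachable before invoking anything. A minimal polling helper for that purpose could look like this (the function name and defaults are ours, not part of any library):

```python
import time
from typing import Callable


def wait_for(
    condition: Callable[[], bool],
    timeout: float = 60.0,
    interval: float = 1.0,
) -> None:
    """Poll `condition` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return
        time.sleep(interval)
    raise TimeoutError(f"condition not met within {timeout}s")
```

In a pytest fixture this could wrap e.g. an HTTP readiness probe of the Kafka REST Proxy, so the dry-run test only starts once the infrastructure responds.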

Execute

  1. Deploy:
    1. Kafka Apps: We deploy the Kubernetes resources (Producer, Streams apps) and then verify the resulting resources (deployments, pods, jobs, etc.).
    2. Kafka Connect: We compare the applied configuration (retrieved via the Connect REST API) against the defined configuration.
    3. Topics: For now, we can use the Kafka REST Proxy to check whether the topic was created. Later, when we switch to Strimzi topics, this can be done with the Kubernetes client.
  2. Destroy (not a high priority):
    1. Kafka Apps: All the Kubernetes resources should be deleted. We should verify that the consumer group still exists.
    2. Kafka Connect: Verify that the configured connector no longer exists. The consumer group and offset topic should remain.
    3. Topics: Should be untouched.
  3. Reset:
    1. Kafka Apps: All the Kubernetes resources should be deleted. We should verify that the consumer group is deleted.
    2. Kafka Connect: Verify that the configured connector no longer exists. The consumer group and offset topic should be deleted.
    3. Topics: Should be untouched.
  4. Clean:
    1. Kafka Apps: All the Kubernetes resources should be deleted. We should verify that the consumer group is deleted.
    2. Kafka Connect: Verify that the configured connector no longer exists. The consumer group and offset topic should be deleted.
    3. (Output) Topics: Should be deleted.
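For step 1.1, the Kubernetes client returns Deployment objects whose status can be checked for readiness. Extracting that check into a small pure function keeps it unit-testable without a cluster; the function name is ours, and it assumes the object shape returned by the Kubernetes Python client (`spec.replicas`, `status.ready_replicas`):

```python
def deployment_ready(deployment) -> bool:
    """True if the Deployment reports at least as many ready replicas
    as its spec requests (both fields may be None on fresh objects)."""
    desired = deployment.spec.replicas or 0
    ready = deployment.status.ready_replicas or 0
    return ready >= desired
```

In the test itself, the object would come from something like `AppsV1Api().read_namespaced_deployment(...)`, polled until this predicate holds or a timeout is reached.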
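For step 1.2, the configuration reported by the Connect REST API (`GET /connectors/<name>/config`) can be compared against the defined configuration. Since Connect may add defaults and extra keys on top of what was submitted, a subset check is more robust than strict equality. A sketch, with the helper name being ours:

```python
def connector_config_applied(defined: dict, applied: dict) -> bool:
    """Return True if every key/value pair of the defined connector
    configuration appears unchanged in the configuration reported by
    the Connect REST API (which may contain additional keys)."""
    return all(applied.get(key) == value for key, value in defined.items())
```

The same subset comparison works for the destroy/reset/clean scenarios in reverse: after deletion, the REST API call for the connector should return 404 instead of a configuration.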

Trigger & Run