Metadata-Version: 2.3
Name: kavari
Version: 0.1.0
Summary: Typed Python Kafka Manager - easy publish/listen to kafka topics embraced with strong types
License: Apache 2.0
Author: Szymon Dudziak
Author-email: sdudziak@users.noreply.github.com
Requires-Python: >=3.9
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Dist: confluent-kafka (>=2.10.0,<3.0.0)
Requires-Dist: dacite (>=1.9.2,<2.0.0)
Description-Content-Type: text/markdown

Kavari: easy, automated Kafka publish/subscription with strong types
---

This tool makes using Kafka easy and safe,
building on the best practices and power of [`confluent_kafka`](https://github.com/confluentinc/confluent-kafka-python).

## Publishing a message

Create a message type that defines the payload (our strongly typed message format):
```python
from kavari import KafkaMessage


class TestKafkaMessage(KafkaMessage):
    topic = "test_topic"

    def __init__(self, payload: str):
        super().__init__()
        self.payload: str = payload

    def get_partition_key(self) -> str:
        # Messages with the same partition key land on the same partition.
        return "1"
```

Then publish it to the topic, just by calling:
```python
msg: TestKafkaMessage = TestKafkaMessage("test_message")
kafka_manager.publish_message(msg, lambda msg, ex: print("Message published"))
```
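If you care about delivery failures, the callback can inspect the exception argument. A minimal sketch in plain Python, assuming (as the lambda above suggests) that kavari invokes the callback with the message and an exception that is `None` on success — `on_delivery` is a hypothetical helper, not part of kavari:

```python
def on_delivery(msg, ex) -> str:
    # Delivery-callback sketch: ex is None when publishing succeeded,
    # otherwise it carries the failure reason.
    if ex is not None:
        return f"publishing failed: {ex}"
    return "message published"
```

It can then be wired in as `kafka_manager.publish_message(msg, lambda m, e: print(on_delivery(m, e)))`.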
Easy? I hope so! Now let's consume this message.

## Consuming a message

Define the handler class:
```python
from typing import Optional

from kavari import KafkaMessageConsumer, kafka_message_handler


@kafka_message_handler(message_cls=TestKafkaMessage)
class TestKafkaMessageConsumer(KafkaMessageConsumer):

    def __init__(self):
        # Optional[str] keeps compatibility with Python 3.9
        # (the str | None syntax requires 3.10+).
        self.received_message: Optional[str] = None

    def handle(self, message_data: str) -> None:
        self.received_message = message_data
```

That's (almost) it!
Once the consumer becomes available via a provider (any DI container, for example), each message is handled out of the box.
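If you don't use a DI framework, the provider can be any plain callable that maps a consumer class to an instance. A hypothetical minimal sketch (`SimpleProvider` is not part of kavari):

```python
class SimpleProvider:
    """Caches one instance per consumer class, so consumer state
    (like received_message above) survives across handled messages."""

    def __init__(self):
        self._instances = {}

    def resolve(self, cls):
        # Lazily instantiate and cache the requested consumer class.
        if cls not in self._instances:
            self._instances[cls] = cls()
        return self._instances[cls]
```

Such a `provider.resolve` is the kind of callable you would hand to `set_consumer_provider` (shown in the configuration below).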

I hope you like the concept!

To unlock the full power of this library, you need to configure it.

## Configuration

An example configuration, compatible with a DI container (here, `dependency-injector`):

```python
import logging

from dependency_injector.containers import DeclarativeContainer
from dependency_injector.providers import Singleton

from kavari import FibonacciRetryPolicy, KafkaManager, kavari_create

logger = logging.getLogger(__name__)


class Container(DeclarativeContainer):
    kafka_manager: Singleton[KafkaManager] = Singleton(
        lambda: kavari_create(
            bootstrap_servers="bootstrap_location:2973",
            group_id="unique_group_identifier",
            publishing_retry_policy=FibonacciRetryPolicy(max_attempts=10),
            logger=logger,
            auto_commit=False,
            auto_offset_reset="earliest",
        )
    )
```
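`FibonacciRetryPolicy(max_attempts=10)` caps publishing retries at ten attempts; presumably the waits between attempts grow along the Fibonacci sequence. A sketch of such a backoff schedule, as an illustration only — this is an assumption, not kavari's actual implementation:

```python
def fibonacci_delays(max_attempts: int) -> list:
    # Backoff sketch: successive waits of 1, 1, 2, 3, 5, ... seconds,
    # one delay per retry attempt.
    delays, a, b = [], 1, 1
    for _ in range(max_attempts):
        delays.append(a)
        a, b = b, a + b
    return delays
```

Growing delays like these give a struggling broker progressively more room to recover between attempts.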

Then, in the bootstrap of the project, add (here, with FastAPI):
```python
from contextlib import asynccontextmanager

from fastapi import FastAPI

container = Container()


@asynccontextmanager
async def lifespan(app: FastAPI):
    # This part is called on application start.
    container.logger().info("Initiating startup & background jobs")
    # The consumer provider is called to obtain the particular type of
    # consumer, making the autoresolve feature work out of the box.
    container.kafka_manager().set_consumer_provider(container.resolve)
    container.kafka_manager().start_consumer_loop()
    yield
    # This part is called when the application is tearing down.
    container.logger().info("Stopping background jobs")
    container.kafka_manager().stop_consumer_loop()


app = FastAPI(lifespan=lifespan)
```

---
