kafka-rs

🦀 Kafka-RS: Educational Kafka Implementation in Rust


An educational Apache Kafka-compatible message broker implementation in Rust, designed for learning how Kafka works under the hood. Built with Domain-Driven Design (DDD) principles to demonstrate clean architecture patterns in systems programming.

🎯 Purpose

This project serves as an educational resource for understanding:

โš ๏ธ Educational Use Only: This implementation prioritizes learning and clarity over production performance. It uses in-memory storage and simplified algorithms.

✨ Features

🔌 Kafka Protocol Compatibility

๐Ÿ—๏ธ Domain-Driven Design Architecture

📚 Educational Focus

🚀 Quick Start

Prerequisites

Installation & Running

# Clone the repository
git clone https://github.com/elsonwu/kafka-rs.git
cd kafka-rs

# Run the server (default port 9092)
cargo run --bin kafka-rs

# Or with custom port
cargo run --bin kafka-rs -- --port 9093

# Run tests
cargo test

# Run with logging
RUST_LOG=info cargo run --bin kafka-rs

The server will start and listen on localhost:9092 (or your specified port).
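Under the hood, startup boils down to binding a TCP listener on that port. The sketch below shows just that bind step, using only the standard library (it is not the project's actual startup code); the real server would then read length-prefixed Kafka protocol frames from each accepted connection.

```rust
use std::net::TcpListener;

// Bind the broker's listening socket on localhost.
fn bind(port: u16) -> std::io::Result<TcpListener> {
    // Port 0 asks the OS for any free port, which is handy in tests.
    TcpListener::bind(("127.0.0.1", port))
}
```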

🧪 Testing with KafkaJS

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'test-app',
  brokers: ['localhost:9092']
});

const producer = kafka.producer();
const consumer = kafka.consumer({ groupId: 'test-group' });

async function example() {
  // Connect
  await producer.connect();
  await consumer.connect();
  
  // Produce messages
  await producer.send({
    topic: 'test-topic',
    messages: [
      { key: 'key1', value: 'Hello Kafka-RS!' },
      { key: 'key2', value: 'Educational implementation' }
    ]
  });
  
  // Consume messages
  await consumer.subscribe({ topic: 'test-topic' });
  await consumer.run({
    eachMessage: async ({ partition, message }) => {
      console.log({
        partition,
        offset: message.offset,
        key: message.key?.toString(),
        value: message.value?.toString()
      });
    }
  });
}

example().catch(console.error);

📖 Documentation

Comprehensive documentation is available in the docs/ folder:

Core Documentation

Deep Dive Documentation

These deep dive documents capture the intricate details of Kafka's operation that you can only discover by implementing a compatible system from scratch.

๐Ÿ—๏ธ Architecture

graph TB
    subgraph "Infrastructure Layer"
        A[TCP Server] --> B[Protocol Handler]
        B --> C[Message Encoding/Decoding]
    end
    
    subgraph "Application Layer"
        D[Use Cases] --> E[DTOs]
    end
    
    subgraph "Domain Layer"
        F[Entities] --> G[Value Objects]
        G --> H[Domain Services]
        H --> I[Repository Interfaces]
    end
    
    A --> D
    D --> F

Domain Layer (src/domain/)

Application Layer (src/application/)

Infrastructure Layer (src/infrastructure/)
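The dependency direction in the diagram above — infrastructure and application both point inward at the domain — can be sketched in a few lines of Rust. The types here are illustrative stand-ins, not the project's actual code: the domain defines a repository interface, the application layer's use case depends only on that trait, and infrastructure supplies the concrete (in-memory) implementation.

```rust
// Domain layer: entity + repository interface.
struct Message {
    offset: u64,
    payload: Vec<u8>,
}

trait MessageRepository {
    fn save(&mut self, topic: &str, payload: Vec<u8>) -> u64;
}

// Application layer: a use case that depends only on the domain trait.
struct ProduceUseCase<R: MessageRepository> {
    repo: R,
}

impl<R: MessageRepository> ProduceUseCase<R> {
    fn execute(&mut self, topic: &str, payload: Vec<u8>) -> u64 {
        self.repo.save(topic, payload)
    }
}

// Infrastructure layer: a concrete in-memory implementation.
#[derive(Default)]
struct InMemoryRepository {
    messages: Vec<Message>,
}

impl MessageRepository for InMemoryRepository {
    fn save(&mut self, _topic: &str, payload: Vec<u8>) -> u64 {
        let offset = self.messages.len() as u64;
        self.messages.push(Message { offset, payload });
        offset
    }
}
```

Swapping the storage backend (say, for a file-backed log) would only touch the infrastructure layer; the use case and domain stay unchanged, which is the point of the layering.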

🧪 Testing

# Run all Rust tests
cargo test

# Run specific test suite
cargo test --test integration_tests

# Run with output
cargo test -- --nocapture

# Run benchmarks (if available)
cargo bench

Kafka Client Integration Test

This project includes a comprehensive integration test using real Kafka JavaScript clients to verify protocol compatibility:

# Prerequisites: Node.js 18+ required
cd integration/kafka-client-test
npm install

# Start the Kafka server (in another terminal)
cargo run --release -- --port 9092

# Run the integration test
npm test

The integration test:

Test Coverage

๐Ÿ› ๏ธ Development

Quick Development Commands

We provide a Makefile with common development tasks:

# Show all available commands
make help

# Format code (fixes formatting issues automatically)
make format

# Check formatting without making changes  
make check

# Run linting with clippy
make lint

# Run all tests
make test

# Build the project
make build

# Build in release mode
make build-release

# Start the Kafka server
make server

# Fix formatting and linting issues
make fix

# Run integration tests with real Kafka clients
make integration-test

# Run pre-commit checks (format + lint + test)
make pre-commit

# Run CI checks locally
make ci

Development Workflow

  1. Before making changes:

    make ci  # Ensure everything is working
    
  2. After making changes:

    make fix      # Auto-fix formatting and linting
    make test     # Run tests
    make ci       # Final check before commit
    
  3. Before committing:

    make pre-commit  # Comprehensive pre-commit checks
    

Formatting Configuration

The project uses rustfmt.toml for consistent code formatting. The CI will automatically check formatting and provide helpful error messages if issues are found.

Project Structure

kafka-rs/
├── src/
│   ├── domain/          # Business logic layer
│   ├── application/     # Use case orchestration
│   ├── infrastructure/  # I/O and external systems
│   ├── lib.rs           # Library interface
│   └── main.rs          # Binary entry point
├── docs/                # Comprehensive documentation
├── tests/               # Rust integration tests
├── integration/         # External client integration tests
│   └── kafka-client-test/  # KafkaJS compatibility tests
└── .github/workflows/   # CI/CD pipeline

Contributing

This is an educational project, but contributions are welcome:

  1. Fork the repository
  2. Create a feature branch with a descriptive name (e.g., feat/consumer-groups)
  3. Make your changes with tests and documentation
  4. Use conventional commits (e.g., feat:, fix:, chore:, docs:)
  5. Run make pre-commit to ensure code quality
  6. Submit a pull request with a conventional commit title

Conventional Commit Examples:

Design Principles

📋 Supported Kafka APIs

API           Key  Status  Description
Produce       0    ✅      Send messages to topics
Fetch         1    ✅      Consume messages from topics
Metadata      3    ✅      Discover topics and brokers
OffsetCommit  8    ✅      Commit consumer offsets
OffsetFetch   9    ✅      Fetch committed offsets
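The key numbers in the table come straight from the Kafka wire protocol: every request header carries an i16 api_key that tells the broker which handler to dispatch to. A broker only needs to map the keys it supports, as in this illustrative sketch (the enum and helper are not the project's actual types):

```rust
// Supported Kafka API keys; the numeric values are fixed by the protocol.
#[derive(Debug, PartialEq)]
enum ApiKey {
    Produce = 0,
    Fetch = 1,
    Metadata = 3,
    OffsetCommit = 8,
    OffsetFetch = 9,
}

impl ApiKey {
    // Map the i16 api_key from a request header to a supported variant.
    fn from_i16(key: i16) -> Option<ApiKey> {
        match key {
            0 => Some(ApiKey::Produce),
            1 => Some(ApiKey::Fetch),
            3 => Some(ApiKey::Metadata),
            8 => Some(ApiKey::OffsetCommit),
            9 => Some(ApiKey::OffsetFetch),
            _ => None, // unsupported in this educational broker
        }
    }
}
```

Requests with an unmapped key (e.g., ApiVersions negotiation, group coordination APIs) fall into the `None` arm, which is where the educational simplifications below come from.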

Limitations (Educational Simplifications)

📊 Performance

While not optimized for production use, the implementation can handle:

๐Ÿ” Monitoring & Debugging

# Enable debug logging
RUST_LOG=debug cargo run --bin kafka-rs

# Enable trace logging (verbose)
RUST_LOG=trace cargo run --bin kafka-rs

# JSON structured logging
RUST_LOG=info cargo run --bin kafka-rs 2>&1 | jq
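The RUST_LOG values above form a severity hierarchy: trace includes everything debug does, debug includes info, and so on down to error. As a rough, stdlib-only sketch of that filtering idea (real builds would rely on a crate such as env_logger or tracing, not this code):

```rust
// Severity levels in increasing verbosity; derived ordering follows
// declaration order, so Error < Warn < Info < Debug < Trace.
#[derive(Debug, PartialEq, PartialOrd)]
enum Level {
    Error,
    Warn,
    Info,
    Debug,
    Trace,
}

// Parse a RUST_LOG-style value, defaulting to Error when unset/unknown.
fn level_from_env(raw: Option<String>) -> Level {
    match raw.as_deref() {
        Some("trace") => Level::Trace,
        Some("debug") => Level::Debug,
        Some("info") => Level::Info,
        Some("warn") => Level::Warn,
        _ => Level::Error,
    }
}

// A message is emitted when its level is at or below the configured one.
fn enabled(configured: &Level, msg_level: &Level) -> bool {
    msg_level <= configured
}
```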

๐Ÿค Community & Learning

This project is part of learning Rust systems programming and distributed systems concepts:

📜 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments


โญ Star this repo if you find it useful for learning Kafka internals or Rust systems programming!

📚 Learning Resources

Built with ❤️ and 🦀 for educational purposes.