An educational Apache Kafka-compatible message broker implementation in Rust, designed for learning how Kafka works under the hood. Built with Domain-Driven Design (DDD) principles to demonstrate clean architecture patterns in systems programming.
This project serves as an educational resource for understanding:
⚠️ Educational Use Only: This implementation prioritizes learning and clarity over production performance. It uses in-memory storage and simplified algorithms.
# Clone the repository
git clone https://github.com/elsonwu/kafka-rs.git
cd kafka-rs
# Run the server (default port 9092)
cargo run --bin kafka-rs
# Or with custom port
cargo run --bin kafka-rs -- --port 9093
# Run tests
cargo test
# Run with logging
RUST_LOG=info cargo run --bin kafka-rs
The server will start and listen on localhost:9092 (or your specified port).
const { Kafka } = require('kafkajs');
const kafka = new Kafka({
clientId: 'test-app',
brokers: ['localhost:9092']
});
const producer = kafka.producer();
const consumer = kafka.consumer({ groupId: 'test-group' });
async function example() {
// Connect
await producer.connect();
await consumer.connect();
// Produce messages
await producer.send({
topic: 'test-topic',
messages: [
{ key: 'key1', value: 'Hello Kafka-RS!' },
{ key: 'key2', value: 'Educational implementation' }
]
});
// Consume messages
await consumer.subscribe({ topic: 'test-topic', fromBeginning: true });
await consumer.run({
eachMessage: async ({ partition, message }) => {
console.log({
partition,
offset: message.offset,
key: message.key?.toString(),
value: message.value?.toString()
});
}
});
}
example().catch(console.error);
Comprehensive documentation is available in the docs/ folder:
These deep-dive documents capture the intricate details of Kafka's operation that you can only discover by implementing a compatible system from scratch.
graph TB
subgraph "Infrastructure Layer"
A[TCP Server] --> B[Protocol Handler]
B --> C[Message Encoding/Decoding]
end
subgraph "Application Layer"
D[Use Cases] --> E[DTOs]
end
subgraph "Domain Layer"
F[Entities] --> G[Value Objects]
G --> H[Domain Services]
H --> I[Repository Interfaces]
end
A --> D
D --> F
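The dependency rule in the diagram (infrastructure and application point inward to the domain) can be sketched in Rust. The names below (`Message`, `MessageRepository`, `InMemoryRepository`) are illustrative stand-ins, not the crate's actual identifiers:

```rust
use std::collections::HashMap;

// Domain layer: an entity plus a repository *interface* (trait).
// The domain knows nothing about how messages are stored.
pub struct Message {
    pub key: Option<String>,
    pub value: String,
}

pub trait MessageRepository {
    /// Append a message to a topic's log; returns the assigned offset.
    fn append(&mut self, topic: &str, msg: Message) -> u64;
    /// Fetch the message stored at a given offset, if any.
    fn fetch(&self, topic: &str, offset: u64) -> Option<&Message>;
}

// Infrastructure layer: an in-memory implementation of the interface,
// mirroring the project's stated use of in-memory storage.
#[derive(Default)]
pub struct InMemoryRepository {
    logs: HashMap<String, Vec<Message>>,
}

impl MessageRepository for InMemoryRepository {
    fn append(&mut self, topic: &str, msg: Message) -> u64 {
        let log = self.logs.entry(topic.to_string()).or_default();
        log.push(msg);
        (log.len() - 1) as u64
    }

    fn fetch(&self, topic: &str, offset: u64) -> Option<&Message> {
        self.logs.get(topic)?.get(offset as usize)
    }
}

fn main() {
    let mut repo = InMemoryRepository::default();
    let offset = repo.append("demo", Message { key: None, value: "hello".into() });
    println!("stored at offset {offset}");
    assert_eq!(repo.fetch("demo", offset).unwrap().value, "hello");
}
```

Because use cases in the application layer depend only on the `MessageRepository` trait, the in-memory store could be swapped for a disk-backed one without touching domain or application code.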
src/domain/
src/application/
src/infrastructure/
# Run all Rust tests
cargo test
# Run specific test suite
cargo test --test integration_tests
# Run with output
cargo test -- --nocapture
# Run benchmarks (if available)
cargo bench
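Where Criterion benchmarks are not set up, a std-only timing sketch can give a rough feel for the in-memory log's append throughput. This is a hypothetical micro-benchmark for illustration, not part of the project's bench suite:

```rust
use std::time::Instant;

/// Times appending `n` (offset, payload) pairs to a plain Vec,
/// standing in for the broker's in-memory topic log.
/// Returns the elapsed time in microseconds.
fn bench_appends(n: usize) -> u128 {
    let mut log: Vec<(u64, String)> = Vec::with_capacity(n);
    let start = Instant::now();
    for i in 0..n {
        log.push((i as u64, format!("message-{i}")));
    }
    start.elapsed().as_micros()
}

fn main() {
    let micros = bench_appends(100_000);
    println!("appended 100k messages in {micros} µs");
}
```

Timings from a sketch like this are only indicative; wall-clock results vary with allocator behavior and build profile (always measure with `--release`).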
This project includes a comprehensive integration test using real Kafka JavaScript clients to verify protocol compatibility:
# Prerequisites: Node.js 18+ required
cd integration/kafka-client-test
npm install
# Start the Kafka server (in another terminal)
cargo run --release -- --port 9092
# Run the integration test
npm test
The integration test:
We provide a Makefile with common development tasks:
# Show all available commands
make help
# Format code (fixes formatting issues automatically)
make format
# Check formatting without making changes
make check
# Run linting with clippy
make lint
# Run all tests
make test
# Build the project
make build
# Build in release mode
make build-release
# Start the Kafka server
make server
# Fix formatting and linting issues
make fix
# Run integration tests with real Kafka clients
make integration-test
# Run pre-commit checks (format + lint + test)
make pre-commit
# Run CI checks locally
make ci
Before making changes:
make ci # Ensure everything is working
After making changes:
make fix # Auto-fix formatting and linting
make test # Run tests
make ci # Final check before commit
Before committing:
make pre-commit # Comprehensive pre-commit checks
The project uses rustfmt.toml for consistent code formatting. The CI will automatically check formatting and provide helpful error messages if issues are found.
kafka-rs/
├── src/
│   ├── domain/            # Business logic layer
│   ├── application/       # Use case orchestration
│   ├── infrastructure/    # I/O and external systems
│   ├── lib.rs             # Library interface
│   └── main.rs            # Binary entry point
├── docs/                  # Comprehensive documentation
├── tests/                 # Rust integration tests
├── integration/           # External client integration tests
│   └── kafka-client-test/ # KafkaJS compatibility tests
└── .github/workflows/     # CI/CD pipeline
This is an educational project, but contributions are welcome:
Use descriptive feature branches such as feat/consumer-groups
Use conventional commit prefixes: feat:, fix:, chore:, docs:
Run make pre-commit to ensure code quality

Conventional Commit Examples:
feat: add consumer group coordination
fix: resolve metadata response parsing issue
chore: update CI workflow configuration
docs: improve getting started guide
| API | Key | Status | Description |
|---|---|---|---|
| Produce | 0 | ✅ | Send messages to topics |
| Fetch | 1 | ✅ | Consume messages from topics |
| Metadata | 3 | ✅ | Discover topics and brokers |
| OffsetCommit | 8 | ✅ | Commit consumer offsets |
| OffsetFetch | 9 | ✅ | Fetch committed offsets |
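Every one of these APIs begins with the same request header on the wire: big-endian `api_key` (int16), `api_version` (int16), and `correlation_id` (int32), per the Kafka protocol. A minimal decoding sketch (`parse_request_header` is an illustrative name, not the project's actual function):

```rust
/// Parses the fixed-size prefix of a Kafka request header from the
/// bytes following the 4-byte length frame. All integers are
/// big-endian (network byte order), as the Kafka protocol specifies.
/// Returns None if fewer than 8 bytes are available.
fn parse_request_header(buf: &[u8]) -> Option<(i16, i16, i32)> {
    if buf.len() < 8 {
        return None;
    }
    let api_key = i16::from_be_bytes([buf[0], buf[1]]);
    let api_version = i16::from_be_bytes([buf[2], buf[3]]);
    let correlation_id = i32::from_be_bytes([buf[4], buf[5], buf[6], buf[7]]);
    Some((api_key, api_version, correlation_id))
}

fn main() {
    // api_key = 3 (Metadata), api_version = 1, correlation_id = 42
    let bytes = [0x00, 0x03, 0x00, 0x01, 0x00, 0x00, 0x00, 0x2A];
    let (key, version, corr) = parse_request_header(&bytes).unwrap();
    println!("api_key={key} version={version} correlation_id={corr}");
    assert_eq!((key, version, corr), (3, 1, 42));
}
```

The broker dispatches on `api_key` (0 for Produce, 1 for Fetch, 3 for Metadata, and so on) and echoes `correlation_id` back in the response so the client can match replies to requests.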
While not optimized for production use, the implementation can handle:
# Enable debug logging
RUST_LOG=debug cargo run --bin kafka-rs
# Enable trace logging (verbose)
RUST_LOG=trace cargo run --bin kafka-rs
# Pipe logs through jq (useful only if JSON log output is enabled)
RUST_LOG=info cargo run --bin kafka-rs 2>&1 | jq
This project is part of learning Rust systems programming and distributed systems concepts:
This project is licensed under the MIT License - see the LICENSE file for details.
⭐ Star this repo if you find it useful for learning Kafka internals or Rust systems programming!
Built with ❤️ and 🦀 for educational purposes.