Find this one specific record 👀
KaDeck contains a variety of data analysis features to help you find the exact records you are looking for.
- Filtering using data object attributes
- Search by time and time windows
- Search by key and value
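Conceptually, attribute and time-window filtering boils down to predicates over key, value, and timestamp. A minimal sketch in plain Java (the `Rec` type and its fields are a hypothetical illustration, not KaDeck's API):

```java
import java.time.Instant;
import java.util.List;
import java.util.stream.Collectors;

public class RecordFilter {
    // Hypothetical record shape -- not KaDeck's actual API.
    public record Rec(String key, String value, Instant timestamp) {}

    // Keep only records inside the half-open window [from, to)
    // whose value contains the search term.
    public static List<Rec> filter(List<Rec> recs, Instant from, Instant to, String term) {
        return recs.stream()
                .filter(r -> !r.timestamp().isBefore(from) && r.timestamp().isBefore(to))
                .filter(r -> r.value().contains(term))
                .collect(Collectors.toList());
    }
}
```

KaDeck applies this kind of filtering interactively, so you narrow the window and predicates without writing code.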
Data routing & export
Select records and forward them to a stream, or export them as a table.
- Correct erroneous data sets
- Send data sets to a stream
- Export data sets, e.g. as CSV
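The CSV export amounts to escaping each field and joining rows. A rough stdlib-only sketch (the two-column row layout is a simplification, not KaDeck's actual export format):

```java
import java.util.List;
import java.util.stream.Collectors;

public class CsvExport {
    // Minimal CSV escaping: quote a field if it contains a comma,
    // quote, or newline, doubling embedded quotes.
    static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    // Render one line per record: key,value.
    public static String toCsv(List<String[]> rows) {
        return rows.stream()
                .map(r -> escape(r[0]) + "," + escape(r[1]))
                .collect(Collectors.joining("\n"));
    }
}
```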
Data transformation 💪
- Stateful calculations supported (e.g., averages, grouping)
- Runs in real time
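To illustrate what a stateful calculation such as grouping with averages computes, here is a plain-Java sketch (the `Measurement` type is hypothetical; KaDeck runs aggregations like this over live streams):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StatefulAgg {
    // Hypothetical input shape for the aggregation.
    public record Measurement(String sensor, double value) {}

    // Group measurements by sensor and compute the average per group.
    public static Map<String, Double> averageBySensor(List<Measurement> in) {
        return in.stream().collect(
                Collectors.groupingBy(Measurement::sensor,
                        Collectors.averagingDouble(Measurement::value)));
    }
}
```

The "stateful" part is exactly the accumulated per-group state: each incoming record updates a running sum and count rather than being processed in isolation.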
Create & share data reports
Views allow you to save and share individual data extracts and create reports based on filtering criteria and transformations.
- Create a report that shows your critical data sets.
- Easily access the report on a daily basis or share it with your colleagues.
- Also ideal for giving non-technical users customized views into the data.
- Spreadsheet-like UI with columns, sorting, and filtering.
- Export as CSV if desired.
Codecs & auto-detection
KaDeck contains multiple codecs for keys and values, as well as for the headers of your records. Our auto-detection mechanism lets you dive straight into your topics without configuring codecs in advance.
For consumption and ingestion:
- JSON Codec
- Avro Codec
- String Codec
- Integer Codec
- Float Codec
- Long Codec
- Double Codec
For consumption only:
- Avro Embedded Codec
- Avro Topic Record Strategy Codec¹
- Avro Record Strategy Codec¹
- CSV Codec for single and multi-line CSV
You can implement your own custom codecs to add support for custom data types to KaDeck.
¹ These codecs support a “subject naming strategy”, which does not constrain a topic to a single Avro schema per key and value but lets you use multiple Avro schemas in one topic.
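For context, in the Confluent Schema Registry ecosystem the subject naming strategy is configured on the serializer: with a record-name-based strategy, subjects are derived from the Avro record name instead of the topic name, so one topic can carry several schemas. A minimal producer-side configuration sketch (the strategy class name is Confluent's `TopicRecordNameStrategy`; the URLs are placeholders):

```java
import java.util.Properties;

public class SubjectStrategyConfig {
    public static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");        // placeholder
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder
        // Derive the subject from topic + record name, allowing several
        // Avro schemas in the same topic.
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy");
        return props;
    }
}
```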
Connectivity and security 📡
KaDeck is capable of managing multiple environments and supports Apache Kafka and Amazon Kinesis.
Quickly ingest one or multiple records using our built-in or your own custom codecs. Add record headers with types, use existing records as templates, or start from scratch. Use our convenient user interface, or switch to the JSON view if you prefer working with raw text.
- Ingest multiple records at once
- Send records from one topic to another
- Support for record headers
- Intuitive UI and additional JSON view
- Use existing records as templates
- Many supported codecs
- Extensible with your own custom codecs
Manage your Kafka Connect clusters
Add new data sources or sinks in no time. KaDeck lets you easily manage and monitor your Kafka Connect connectors.
- Add new connectors
- Manage your connectors' state
- Edit configurations of connectors
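Under the hood, Kafka Connect exposes a REST API where a new connector is registered by POSTing its name and configuration to `/connectors`. A stdlib-only sketch that builds (but does not send) such a request, using Kafka's bundled `FileStreamSourceConnector` as the example; the host, connector name, file path, and topic are placeholder values:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ConnectClient {
    // Build the request that registers a new connector against the
    // Kafka Connect REST API.
    public static HttpRequest createConnectorRequest(String connectUrl, String json) {
        return HttpRequest.newBuilder()
                .uri(URI.create(connectUrl + "/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }

    // Example payload: a file-source connector writing to topic "demo".
    public static final String EXAMPLE_JSON = """
            {"name": "demo-source",
             "config": {
               "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
               "tasks.max": "1",
               "file": "/tmp/demo.txt",
               "topic": "demo"}}""";
}
```

KaDeck wraps this API in a UI, so adding, pausing, or reconfiguring connectors does not require hand-written REST calls.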
¹ Only available for KaDeck Web. Learn more.
KaDeck comes with many more features, such as an embedded Kafka cluster for quickly testing and developing applications on your local machine.
- Embedded Apache Kafka broker
- Schema Registry management
- ACL management
- Consumer offset management
- FlowView enables end-to-end monitoring of your data flow
- Create and delete topics
- Export single and multiple records
- Test Java applications with live data utilizing our Codecs API
- Seek to offsets across partitions
- Seek to offsets by timestamp
- And much more.
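Seeking by timestamp follows the contract of Kafka's `KafkaConsumer#offsetsForTimes`: for each partition, return the earliest offset whose record timestamp is at or after the target. A conceptual binary-search sketch, assuming timestamps are non-decreasing within the partition (as with `LogAppendTime`):

```java
import java.util.List;

public class TimestampSeek {
    // (offset, timestamp) pairs ordered by offset, timestamps non-decreasing.
    public record Entry(long offset, long timestamp) {}

    // Earliest offset whose timestamp is >= target, or -1 if none exists --
    // the same contract as Kafka's offsetsForTimes lookup.
    public static long seek(List<Entry> log, long target) {
        int lo = 0, hi = log.size();
        while (lo < hi) {
            int mid = (lo + hi) >>> 1;
            if (log.get(mid).timestamp() < target) lo = mid + 1;
            else hi = mid;
        }
        return lo == log.size() ? -1 : log.get(lo).offset();
    }
}
```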
“I never got a local cluster up & running as quick as with KaDeck. The growing number of features for common tasks enables me to focus on the core of my work, reducing the hassle.” — David Weber, Jeed UG