Data analysis and filters
KaDeck contains a variety of data analysis features to help you find the exact records you are looking for.
- Filter by data object attributes
- Transform records on the fly
- Search by time and time windows
- Search by key and value
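Outside KaDeck, the idea behind attribute filtering can be sketched in plain Python; the record shape and the filter predicate below are invented for illustration:

```python
import json

# Sample records as they might arrive from a topic (assumed shape).
raw_records = [
    '{"orderId": 1, "status": "SHIPPED", "amount": 49.99}',
    '{"orderId": 2, "status": "FAILED", "amount": 12.50}',
    '{"orderId": 3, "status": "SHIPPED", "amount": 7.00}',
]

# Filter by a data object attribute: keep only failed orders.
records = [json.loads(r) for r in raw_records]
failed = [r for r in records if r["status"] == "FAILED"]

print(failed)
```

In KaDeck the same predicate is applied through the UI instead of code, but the principle is identical: decode the record, then filter on its attributes.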
Data routing & export
Select records and forward them to a stream, or export them as a table.
- Correct erroneous data sets
- Send data sets to a stream
- Export data sets, e.g., as CSV
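A CSV export of selected records boils down to flattening record fields into columns. A minimal sketch with the Python standard library (the field names and records are made up for illustration):

```python
import csv
import io

# Selected records to export (assumed fields).
records = [
    {"key": "user-1", "event": "login", "ts": "2023-05-01T10:00:00Z"},
    {"key": "user-2", "event": "logout", "ts": "2023-05-01T10:05:00Z"},
]

# Write the records as a CSV table with a header row.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["key", "event", "ts"])
writer.writeheader()
writer.writerows(records)

csv_text = buffer.getvalue()
print(csv_text)
```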
Transform and filter records in real-time with the Quick Processor
- Stateful calculations supported (e.g., averages, grouping)
- Runs in real-time
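Conceptually, a stateful calculation such as a per-key running average keeps state across records instead of looking at each record in isolation. A minimal Python sketch of that idea (the record stream is invented for illustration):

```python
from collections import defaultdict

# Incoming (key, value) pairs, e.g. sensor readings per device (assumed data).
stream = [("dev-a", 10.0), ("dev-b", 4.0), ("dev-a", 20.0), ("dev-a", 30.0)]

# State carried across records: per-key count and running total.
state = defaultdict(lambda: {"count": 0, "total": 0.0})

for key, value in stream:
    s = state[key]
    s["count"] += 1
    s["total"] += value

# Grouped averages derived from the accumulated state.
averages = {k: s["total"] / s["count"] for k, s in state.items()}
print(averages)  # {'dev-a': 20.0, 'dev-b': 4.0}
```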
Find relevant data faster
Views let you save and share individual data extracts and create reports that always contain the latest records matching your filter criteria and transformations.
- Save your individual transformations, filters and view settings
- Share them with your team
- Add attributes as columns in your reports
- Sort records by attributes
Codecs & auto-detection
KaDeck contains multiple codecs for keys and values, as well as for the headers of your records. Our auto-detection mechanism lets you dive straight into your topics without configuring codecs in advance.
For consumption and ingestion:
- JSON Codec
- Avro Codec
- String Codec
- Integer Codec
- Float Codec
- Long Codec
- Double Codec
For consumption only:
- Avro Embedded Codec
- Avro Topic Record Strategy Codec¹
- Avro Record Strategy Codec¹
- CSV Codec for single and multi-line CSV
You can implement your own custom codecs to add support for custom data types to KaDeck.
¹ These codecs support “subject naming strategies”, which do not constrain a topic to a single Avro schema per key and value but let you use multiple Avro schemas in one topic.
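For context, Confluent Schema Registry's subject naming strategies determine the subject a schema is registered under; the record-name-based strategies decouple the subject from the topic, which is what allows several Avro schemas in one topic. A quick illustration of how the subject names are formed (topic and record names are examples):

```python
def topic_name_strategy(topic, is_key):
    # Default: one schema per topic key/value; subject derives from the topic.
    return f"{topic}-{'key' if is_key else 'value'}"

def record_name_strategy(record_fullname):
    # Subject is the fully-qualified record name; many schemas per topic.
    return record_fullname

def topic_record_name_strategy(topic, record_fullname):
    # Subject combines topic and record name.
    return f"{topic}-{record_fullname}"

print(topic_name_strategy("orders", is_key=False))                     # orders-value
print(record_name_strategy("com.acme.OrderCreated"))                   # com.acme.OrderCreated
print(topic_record_name_strategy("orders", "com.acme.OrderCreated"))   # orders-com.acme.OrderCreated
```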
Connectivity and security
KaDeck is capable of managing multiple environments and supports Apache Kafka and Amazon Kinesis.
Quickly ingest one or multiple records using our built-in codecs or your own custom codecs. Add record headers with types, use existing records as templates, or start from scratch. Use our convenient user interface, or switch to the JSON view if you prefer working with the records as text.
- Ingest multiple records at once
- Send records from one topic to another
- Support for record headers
- Intuitive UI and additional JSON view
- Use existing records as templates
- Many supported codecs
- Extensible with your own custom codecs
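Under the hood, ingesting a record means producing key, value, and header bytes. A hedged sketch of assembling one record with a JSON-encoded value and string headers in plain Python; the layout is an assumption for illustration, not KaDeck's internal format:

```python
import json

def build_record(key, value_obj, headers):
    # Serialize the value with a JSON "codec"; headers become (name, bytes)
    # pairs, mirroring Kafka's record header model.
    return {
        "key": key.encode("utf-8"),
        "value": json.dumps(value_obj).encode("utf-8"),
        "headers": [(name, val.encode("utf-8")) for name, val in headers.items()],
    }

record = build_record(
    key="order-42",
    value_obj={"status": "NEW", "amount": 19.99},
    headers={"trace-id": "abc123", "source": "checkout"},
)

print(record["headers"])
```

Starting from an existing record as a template then amounts to copying such a structure and editing the fields before sending.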
KaDeck comes with many more features, such as an embedded Kafka cluster for quickly testing and developing applications on your local machine, export of single or multiple records, and more.
“I never got a local cluster up & running as quick as with KaDeck. The growing number of features for common tasks enables me to focus on the core of my work, reducing the hassle.” — David Weber, Jeed UG