Prep for v0.13.0
horkhe committed Mar 23, 2017
1 parent 5b8ef73 commit 06b53fd
Showing 8 changed files with 117 additions and 173 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -1,6 +1,6 @@
# Changelog

#### Version 0.13.0 (TBD)
#### Version 0.13.0 (2017-03-22)

Implemented:
* At-Least-Once delivery guarantee via synchronous production and
103 changes: 15 additions & 88 deletions README.md
@@ -7,13 +7,19 @@ with automatic consumer group control. It is designed to hide the
complexity of the Kafka client protocol and provide a stupid simple
API that is trivial to implement in any language.

Kafka-Pixy supports Kafka versions from **0.8.2.x** to **0.10.1.x**. It uses the
Kafka [Offset Commit/Fetch API](https://cwiki.apache.org/confluence/display/KAFKA/A+Guide+To+The+Kafka+Protocol#AGuideToTheKafkaProtocol-OffsetCommit/FetchAPI)
to keep track of consumer offsets and ZooKeeper to manage distribution
of partitions among consumer group members.

You can jump to [Quick Start](README.md#quick-start) if you are anxious
to give it a try.
Kafka-Pixy supports Kafka versions from **0.8.2.x** to **0.10.1.x**. It uses
the Kafka [Offset Commit/Fetch API](https://cwiki.apache.org/confluence/display/KAFKA/A+Guide+To+The+Kafka+Protocol#AGuideToTheKafkaProtocol-OffsetCommit/FetchAPI)
to keep track of consumer offsets. However [Group Membership API](https://cwiki.apache.org/confluence/display/KAFKA/A+Guide+To+The+Kafka+Protocol#AGuideToTheKafkaProtocol-GroupMembershipAPI)
is not yet implemented, therefore it needs to talk to Zookeeper directly to
manage consumer group membership.

If you are anxious to get started, then jump to [How-to Install](howto-install.md)
and then proceed with a quick start guide for your weapon of choice:
[Curl](quick-start-curl.md), [Python](quick-start-python.md), or [Golang](quick-start-golang.md).
If you want to use some other language, you can still use either of the
guides for inspiration, but you will need to generate gRPC client stubs
from [grpc.proto](grpc.proto) yourself (please refer to [gRPC documentation](http://www.grpc.io/docs/)
for details).

#### Key Features:

@@ -184,7 +190,7 @@ e.g.:
"key": "0JzQsNGA0YPRgdGP",
"value": "0JzQvtGPINC70Y7QsdC40LzQsNGPINC00L7Rh9C10L3RjNC60LA=",
"partition": 0,
"offset": 13}
"offset": 13
}
```

@@ -282,7 +288,7 @@ GET /clusters/<cluster>/topics/<topic>/consumers
Returns a list of consumers that are subscribed to a topic.

Parameter | Opt | Description
-----------|-----|------------------------------------------------------
-----------|-----|------------------------------------------------
cluster | yes | The name of a cluster to operate on. By default the cluster mentioned first in the `proxies` section of the config file is used.
topic | | The name of a topic to operate on.
group | yes | The name of a consumer group. By default returns data for all known consumer groups subscribed to the topic.
@@ -356,85 +362,6 @@ Command line parameters that Kafka-Pixy accepts are listed below:
You can run `kafka-pixy -help` to make it list all available command line
parameters.

## Quick Start

This instruction assumes that you are trying it on a Linux host, but it will be
pretty much the same on Mac.

### Step 1. Download

```
curl -L https://github.com/mailgun/kafka-pixy/releases/download/v0.12.0/kafka-pixy-v0.12.0-linux-amd64.tar.gz | tar xz
```

### Step 2. Start

```
cd kafka-pixy-v0.12.0-linux-amd64
./kafka-pixy --kafkaPeers "<host1>:9092,...,<hostN>:9092" --zookeeperPeers "<host1>:2181,...,<hostM>:2181"
```

### Step 3. Create Topic (optional)

If your Kafka cluster is configured to require explicit creation of topics, then
create one for your testing (e.g. `foo`). [Here](http://kafka.apache.org/documentation.html#basic_ops_add_topic)
is how you can do that.

### Step 4. Initialize Group Offsets

Consume from the topic on behalf of a consumer group (e.g. `bar`) for the first
time. The consumption will fail with the long polling timeout (3 seconds), but
the important side effect of that is that initial offsets will be stored in
Kafka.

```
curl -G localhost:19092/topics/foo/messages?group=bar
```

Output:

```json
{
"error": "long polling timeout"
}
```

### Step 5. Produce

```
curl -X POST localhost:19092/topics/foo/messages?sync \
-H 'Content-Type: text/plain' \
-d 'blah blah blah'
```

The output tells the partition the message has been submitted to and the offset
it has:

```json
{
"partition": 7,
"offset": 974563
}
```

### Step 6. Consume

```
curl -G localhost:19092/topics/foo/messages?group=bar
```

The output provides the retrieved message as a base64 encoded value along with
some metadata:

```json
{
"key": null,
"value": "YmxhaCBibGFoIGJsYWg=",
"partition": 7,
"offset": 974563
}
```

## License

Kafka-Pixy is under the Apache 2.0 license. See the [LICENSE](LICENSE) file for details.
80 changes: 0 additions & 80 deletions consumer/DESIGN.md

This file was deleted.

26 changes: 26 additions & 0 deletions howto-install.md
@@ -0,0 +1,26 @@
This instruction assumes that you are trying it on a Linux host, but it will be
pretty much the same on a Mac.

The easiest way to install Kafka-Pixy is to download and unpack a release
archive:

```
curl -L https://github.com/mailgun/kafka-pixy/releases/download/v0.13.0/kafka-pixy-v0.13.0-linux-amd64.tar.gz | tar xz
```

Create a configuration file using `default.yaml` as a template:

```
cd kafka-pixy-v0.13.0-linux-amd64
cp default.yaml config.yaml
```

The default settings in the config file should be good enough for you to give
Kafka-Pixy a try, and you can always fine-tune them later. But before the first
run you need to at least point it at your Kafka and Zookeeper clusters.
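For illustration only, here is a rough sketch of what those settings look like. The key names below are written from memory and the host names are placeholders; treat the shipped `default.yaml` as the authoritative reference:

```yaml
proxies:
  # One section per Kafka cluster that this Kafka-Pixy instance proxies.
  default:
    kafka:
      seed_peers:          # Kafka brokers to bootstrap from
        - kafka-host1:9092
        - kafka-host2:9092
    zoo_keeper:
      seed_peers:          # ZooKeeper ensemble used for group membership
        - zk-host1:2181
```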

Once the config file is updated, you can start Kafka-Pixy:

```
./kafka-pixy --config config.yaml
```
65 changes: 65 additions & 0 deletions quick-start-curl.md
@@ -0,0 +1,65 @@
Note: The HTTP API is only provided for testing Kafka-Pixy from the command
line and for use in operations (consumer/offset API). To produce/consume
messages please use the [gRPC](http://www.grpc.io/docs/) API. You can use pre-generated
client stubs shipped with kafka-pixy for [Python](gen/python) and
[Golang](gen/golang), or generate them yourself from [grpc.proto](grpc.proto).

This tutorial assumes that topic `foo` exists in your Kafka cluster or your
Kafka is configured to create topics on demand.

To make sure that you will be able to consume the first message you produce
in this tutorial, we need to start by making a consume call on behalf of a
consumer group (e.g. `bar`):

```
curl -G localhost:19092/topics/foo/messages?group=bar
```

The call is expected to fail after the configured
[long polling timeout](https://github.com/mailgun/kafka-pixy/blob/master/default.yaml#L103)
elapses. But the important side effect is that initial offsets will be stored
in the Kafka cluster for the consumer group in question.

Note: You can use any consumer group name, but it has to be used consistently
in all calls.
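When scripting against the HTTP API, this expected failure is easy to detect: the response body has the shape `{"error": "long polling timeout"}`. A minimal Python sketch (the helper name is ours) of treating it as "no message yet":

```python
import json


def is_long_polling_timeout(response_body: str) -> bool:
    """Return True if a consume response reports a long polling timeout,
    i.e. no message arrived before the timeout elapsed."""
    try:
        return json.loads(response_body).get("error") == "long polling timeout"
    except ValueError:
        # Not JSON at all, so not a long-polling-timeout response.
        return False


# The very first consume call for a new group is expected to fail this way:
print(is_long_polling_timeout('{"error": "long polling timeout"}'))  # → True
```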

A message can be produced by the following call:

```
curl -X POST localhost:19092/topics/foo/messages?sync \
-d 'May the Force be with you!'
```

The message was produced in `sync` mode, which by [default](https://github.com/mailgun/kafka-pixy/blob/master/default.yaml#L70-L78)
means that Kafka-Pixy waits for all ISR brokers to commit the message
before replying with success. That also ensures that the partition and offset
that the message was committed to are returned in the response. E.g.:

```json
{
"partition": 7,
"offset": 974563
}
```
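For scripting purposes the acknowledgement is plain JSON and trivial to parse. A minimal Python sketch, using the sample response body shown just above:

```python
import json

# Sample sync-produce acknowledgement, as returned by Kafka-Pixy.
ack_body = '{"partition": 7, "offset": 974563}'

ack = json.loads(ack_body)
print(f"committed to partition {ack['partition']} at offset {ack['offset']}")
# → committed to partition 7 at offset 974563
```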

To consume the message produced earlier, call:

```
curl -G localhost:19092/topics/foo/messages?group=bar
```

The output provides the retrieved message as a base64 encoded value along with
some metadata:

```json
{
"key": null,
"value": "TWF5IHRoZSBGb3JjZSBiZSB3aXRoIHlvdSE=",
"partition": 7,
"offset": 974563
}
```
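The `value` should decode back to the message that was produced earlier in this tutorial. A quick Python sketch of decoding a consume response (the response body here is an illustrative sample):

```python
import base64
import json

# A consume response shaped like the one the HTTP API returns; the value
# is the base64 encoding of 'May the Force be with you!'.
response_body = """
{
  "key": null,
  "value": "TWF5IHRoZSBGb3JjZSBiZSB3aXRoIHlvdSE=",
  "partition": 7,
  "offset": 974563
}
"""

msg = json.loads(response_body)
value = base64.b64decode(msg["value"]).decode("utf-8")
print(value)  # → May the Force be with you!
```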

The `key` is null in our case because we did not specify one when the message
was produced, and therefore the partition that the message was committed to
was selected randomly.
1 change: 1 addition & 0 deletions quick-start-golang.md
@@ -0,0 +1 @@
If you need this, please create an issue and I will add the content in no time.
1 change: 1 addition & 0 deletions quick-start-python.md
@@ -0,0 +1 @@
If you need this, please create an issue and I will add the content in no time.
12 changes: 8 additions & 4 deletions scripts/release.sh
@@ -27,10 +27,14 @@ mkdir -p ${TARGET_DIR}
for bin in kafka-pixy testconsumer testproducer; do
cp ${BUILD_DIR}/${bin} ${TARGET_DIR}
done
cp ${PROJECT_ROOT}/README.md ${TARGET_DIR}/README.md
cp ${PROJECT_ROOT}/CHANGELOG.md ${TARGET_DIR}/CHANGELOG.md
cp ${PROJECT_ROOT}/LICENSE ${TARGET_DIR}/LICENSE
cp ${PROJECT_ROOT}/default.yaml ${TARGET_DIR}/default.yaml
cp ${PROJECT_ROOT}/README.md ${TARGET_DIR}
cp ${PROJECT_ROOT}/CHANGELOG.md ${TARGET_DIR}
cp ${PROJECT_ROOT}/LICENSE ${TARGET_DIR}
cp ${PROJECT_ROOT}/default.yaml ${TARGET_DIR}
cp ${PROJECT_ROOT}/grpc.proto ${TARGET_DIR}
mkdir ${TARGET_DIR}/grpc_stubs
cp -r ${PROJECT_ROOT}/gen/* ${TARGET_DIR}/grpc_stubs


# Make an archived distribution.
cd ${RELEASE_DIR}
