- Dec 06, 2020
-
-
Seok Won authored
-
defaultSouth authored
-
Seok Won authored
Every hour, this fetches all notices and stores them as JSON in the form {"TITLE": "title", "DATE": "date posted", "LINK": "http address", "WRITER": "author"}. If this JSON file does not exist yet, or a new notice appears, it compares against the stored JSON and sends the new data to the Consumer; when the Consumer receives new data, it posts the notice to the "#아주대" channel via the Slack API. The last parsing time is also recorded, so if the program is restarted less than an hour later, it does not parse again.
Result:
Last parsing: 1972-12-01 07:00:00
Trying to parse new posts...
Sending a new post...: 12179
...
Last parsing: 2020-12-04 19:11:42.839219
Trying to parse new posts...
No new posts yet... Resting 1 hour...
...
Last parsing: 2020-12-06 11:55:35.386262
Wait for 3494 seconds to sync new posts.
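The diff-and-wait logic described above can be sketched in Python. The field names (TITLE/DATE/LINK/WRITER) come from the commit message; the helper names and sample URLs are hypothetical, not the repo's actual code:

```python
from datetime import datetime, timedelta

PARSE_INTERVAL = timedelta(hours=1)

def find_new_posts(old_posts, new_posts):
    """Return posts from new_posts whose LINK is not already stored."""
    seen = {p["LINK"] for p in old_posts}
    return [p for p in new_posts if p["LINK"] not in seen]

def seconds_until_next_parse(last_parsed, now):
    """Seconds left before the hourly interval elapses (0 if already due)."""
    remaining = (last_parsed + PARSE_INTERVAL) - now
    return max(0, int(remaining.total_seconds()))

old = [{"TITLE": "old post", "DATE": "2020-12-01",
        "LINK": "http://notice/12178", "WRITER": "staff"}]
new = old + [{"TITLE": "new post", "DATE": "2020-12-06",
              "LINK": "http://notice/12179", "WRITER": "staff"}]

fresh = find_new_posts(old, new)      # only the 12179 post is new
wait = seconds_until_next_parse(datetime(2020, 12, 6, 11, 0, 0),
                                datetime(2020, 12, 6, 11, 1, 46))
```

Only the `fresh` posts would be sent to the Consumer; `wait` is how long the producer sleeps before parsing again.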
-
- Dec 03, 2020
-
-
defaultSouth authored
-
defaultSouth authored
-
Seok Won authored
This producer reads all messages from "#general" every 5 seconds (which may get rate-limited), and if a message contains the word "bug", it automatically sends the "USERNAME" and "MESSAGE" to the Kafka consumer. The consumer then leaves a message in the "#kafka" channel saying "USER found a bug ..."
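The filtering step can be sketched as a small pure function (the message shape and helper name are illustrative assumptions, not the repo's actual code):

```python
def bug_reports(messages):
    """Yield (username, text) for each message containing the word 'bug'."""
    for msg in messages:
        text = msg.get("text", "")
        if "bug" in text.lower():
            yield msg.get("user", "unknown"), text

# Simulated Slack channel history for illustration.
history = [
    {"user": "alice", "text": "I think I found a bug in the producer"},
    {"user": "bob", "text": "looks fine to me"},
]
found = list(bug_reports(history))
```

Each `(username, text)` pair found this way would be produced to Kafka, and the consumer would format it into the "#kafka" alert.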
-
- Dec 01, 2020
-
-
defaultSouth authored
-
defaultSouth authored
-
Seok Won authored
Modified version of an official Confluent example. Topic id: SLACK-KAFKA. This automatically posts a message to your specified Slack channel when someone leaves a bad review (for now, you can just send JSON data from the CLI to test; see the README file).
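The "bad review" decision might look like the sketch below. The field names (NAME, RATING, TEXT) and the rating threshold are assumptions for illustration, not the fields the example actually uses:

```python
def slack_message_for_review(review, threshold=3):
    """Return a Slack alert string for a bad review, or None if the rating is fine."""
    if review["RATING"] >= threshold:
        return None
    return (f"Bad review from {review['NAME']}: "
            f"{review['RATING']} stars - {review['TEXT']}")

alert = slack_message_for_review({"NAME": "Kim", "RATING": 1, "TEXT": "Too slow"})
ok = slack_message_for_review({"NAME": "Lee", "RATING": 5, "TEXT": "Great"})
```

In the real pipeline, a non-None result would be posted to the channel via the Slack API.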
-
Seok Won authored
-
- Nov 30, 2020
-
-
Seok Won authored
Create a topic named "twitter-tweets" with the command below, then run this application:
kafka-topics --zookeeper localhost:2181 --create --topic twitter-tweets --partitions 6 --replication-factor 1
-
Seok Won authored
Next we will build a Kafka producer to send tweets to Kafka. https://github.com/twitter/hbc
-
Seok Won authored
In this example, we read "first-topic", partition 0, from offset 5 until the offset reaches 10.
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Key: null, Value: five
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Partition: 0, Offset: 5
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Key: null, Value: Hello World!
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Partition: 0, Offset: 6
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Key: id_1, Value: Hello 1
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Partition: 0, Offset: 7
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Key: id_3, Value: Hello 3
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Partition: 0, Offset: 8
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Key: key_1, Value: world
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Partition: 0, Offset: 9
[main] INFO csw.kafka.study.lesson2.ConsumerDemoAssignSeek - Exiting...
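The assign-and-seek flow above can be sketched with the broker replaced by an in-memory partition log, so the offset arithmetic is visible (record values mirror the log output; the function name is hypothetical):

```python
def read_from_offset(partition_log, start_offset, stop_offset):
    """Mimic seek(start_offset): return (offset, key, value) records
    from start_offset up to, but not including, stop_offset."""
    out = []
    for offset, (key, value) in enumerate(partition_log):
        if offset < start_offset:
            continue
        if offset >= stop_offset:
            break  # read 5 records, then exit, like the demo
        out.append((offset, key, value))
    return out

# Partition 0 of "first-topic", offsets 0-9 (fillers stand in for offsets 0-4).
partition_0 = [(None, f"filler {i}") for i in range(5)] + [
    (None, "five"),
    (None, "Hello World!"),
    ("id_1", "Hello 1"),
    ("id_3", "Hello 3"),
    ("key_1", "world"),
]
records = read_from_offset(partition_0, 5, 10)
```

A real consumer would call `assign()` on the topic-partition and `seek()` to offset 5, then poll until 5 records have been read.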
-
Seok Won authored
+ Python config updates for consumers
+ README updates
-
Seok Won authored
-
- Nov 29, 2020
-
-
Seok Won authored
It reads all the data in the three partitions of a topic named "first-topic". I sent the data as Strings, like "hello world".
-
- Nov 28, 2020
-
-
defaultSouth authored
-
Seok Won authored
At line 34, we added a key parameter. Whenever we send data with the same key, that key always maps to the same partition, e.g.
i = 0, key = "truck_1" -> partition_0
i = 1, key = "truck_1" -> partition_0
...
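The determinism can be shown with a small sketch. Kafka's default partitioner actually uses a murmur2 hash of the key bytes; CRC32 is substituted here only to keep the example dependency-free while showing that equal keys always land on the same partition:

```python
import zlib

NUM_PARTITIONS = 3

def partition_for(key, num_partitions=NUM_PARTITIONS):
    """Deterministically map a key to a partition.
    (Stand-in for Kafka's murmur2-based default partitioner.)"""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

p_first = partition_for("truck_1")
p_repeat = [partition_for("truck_1") for _ in range(5)]
```

Because the hash is a pure function of the key, every send with key "truck_1" goes to the same partition, which preserves per-key ordering.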
-
Seok Won authored
For example, whenever we send data, this callback function executes.
New metadata
Topic: first-topic
Partition: 2
Offset: 13
Timestamp: 1606555474565
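A sketch of what such a delivery callback formats, invoked by hand with the metadata values from the log above (the function name and dict shape are illustrative; the real client passes its own record-metadata object):

```python
def format_metadata(err, metadata):
    """Build the log line a delivery callback would emit after each send."""
    if err is not None:
        return f"Delivery failed: {err}"
    return (f"New metadata  Topic: {metadata['topic']}  "
            f"Partition: {metadata['partition']}  "
            f"Offset: {metadata['offset']}  "
            f"Timestamp: {metadata['timestamp']}")

line = format_metadata(None, {"topic": "first-topic", "partition": 2,
                              "offset": 13, "timestamp": 1606555474565})
```

In the real producer, this function body would live inside the callback registered with each send, running once per acknowledged record.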
-
defaultSouth authored
-
defaultSouth authored
-
defaultSouth authored
-
Seok Won authored
Send the "Hello World!" string to the topic named "first-topic". The producer sends data to the Consumer asynchronously, so we have to flush to see results right away. Execute the command below to see "Hello World!":
>> kafka-console-consumer --bootstrap-server localhost:9092 --topic first-topic --group group-one
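Why flush matters can be illustrated with a toy producer whose send() only buffers, mirroring the asynchronous behavior (this class is a teaching stand-in, not the Kafka client API):

```python
class BufferedProducer:
    """Toy producer: send() only buffers; flush() actually delivers,
    mirroring why the real asynchronous producer needs flush()."""
    def __init__(self):
        self.buffer = []
        self.delivered = []

    def send(self, topic, value):
        self.buffer.append((topic, value))  # queued, not yet delivered

    def flush(self):
        self.delivered.extend(self.buffer)  # drain the queue
        self.buffer.clear()

p = BufferedProducer()
p.send("first-topic", "Hello World!")
before = list(p.delivered)   # still empty: send() is asynchronous
p.flush()
after = list(p.delivered)    # flush() pushed the record out
```

Without the flush (or a clean close), a short-lived program can exit before the buffered record ever reaches the broker.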
-
Seok Won authored
-