- Filters on specific fields
- Search within a given time range
- HTTP endpoint for posting logs
- Kafka queue for streamlined log processing
- Ingestion buffer and batch processing (see the sketch after this list)
- Efficient search queries leveraging Elasticsearch
- Export logs as JSON
- Add new logs via HTTP
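
The buffering and batching step above can be pictured with a minimal Go sketch. This is an illustrative assumption of the pattern, not the repository's actual implementation; the `LogEntry` fields mirror the `/add-log` payload documented below, and the batch size and flush interval are made up:

```go
package main

import (
	"fmt"
	"time"
)

// LogEntry mirrors the fields accepted by POST /add-log.
type LogEntry struct {
	Level      string            `json:"level"`
	Message    string            `json:"message"`
	ResourceID string            `json:"resourceId"`
	Timestamp  string            `json:"timestamp"`
	TraceID    string            `json:"traceId"`
	SpanID     string            `json:"spanId"`
	Commit     string            `json:"commit"`
	Topic      string            `json:"topic"`
	Metadata   map[string]string `json:"metadata"`
}

// bufferAndBatch drains incoming logs and flushes them in batches,
// either when the batch is full or when the flush interval elapses.
func bufferAndBatch(in <-chan LogEntry, batchSize int, interval time.Duration, flush func([]LogEntry)) {
	batch := make([]LogEntry, 0, batchSize)
	ticker := time.NewTicker(interval)
	defer ticker.Stop()

	for {
		select {
		case entry, ok := <-in:
			if !ok { // channel closed: flush what is left and stop
				if len(batch) > 0 {
					flush(batch)
				}
				return
			}
			batch = append(batch, entry)
			if len(batch) >= batchSize {
				flush(batch)
				batch = make([]LogEntry, 0, batchSize)
			}
		case <-ticker.C:
			if len(batch) > 0 {
				flush(batch)
				batch = make([]LogEntry, 0, batchSize)
			}
		}
	}
}

func main() {
	in := make(chan LogEntry)
	done := make(chan struct{})
	go func() {
		// In the real service the flush callback would bulk-index the batch into Elasticsearch.
		bufferAndBatch(in, 100, 2*time.Second, func(b []LogEntry) {
			fmt.Printf("flushing %d log(s)\n", len(b))
		})
		close(done)
	}()

	in <- LogEntry{Level: "info", Message: "User authentication successful", Topic: "auth"}
	close(in)
	<-done
}
```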
- Python
- TypeScript
- Go
- Kafka
- Elasticsearch
- Next.js
- Docker
- Kibana
- Clone the repo: `git clone https://github.com/siAyush/ingestor.git`
- Run the Next.js app
  - Go to the `web` directory: `cd web`
  - Install dependencies and start the dev server: `pnpm i`, then `pnpm run dev`
- Run the ingestor server
  - Go to the `server` directory: `cd server`
  - Install Go dependencies: `go mod download`
  - Install Python dependencies: `pip install -r requirements.txt`
  - Start producing logs: `cd server/logProducers`, then `./runProducers.sh`
  - Start the ingestor server: `cd server/cmd`, then `go run .`
Endpoint: `POST /add-log`

Description: Ingests a new log entry into the system.

Request Example:

    {
      "level": "error",
      "message": "Failed to connect to DB",
      "resourceId": "server-1234",
      "timestamp": "2023-09-15T08:00:00Z",
      "traceId": "abc-xyz-123",
      "spanId": "span-456",
      "commit": "5e5342f",
      "topic": "auth",
      "metadata": { "parentResourceId": "server-0987" }
    }

Response Example:

    { "status": "success" }
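
As a usage sketch, the following Go snippet posts the example payload above to a locally running ingestor. The base URL (`http://localhost:3000`) is an assumption; substitute whatever host and port your server listens on:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	payload := map[string]any{
		"level":      "error",
		"message":    "Failed to connect to DB",
		"resourceId": "server-1234",
		"timestamp":  "2023-09-15T08:00:00Z",
		"traceId":    "abc-xyz-123",
		"spanId":     "span-456",
		"commit":     "5e5342f",
		"topic":      "auth",
		"metadata":   map[string]string{"parentResourceId": "server-0987"},
	}
	body, err := json.Marshal(payload)
	if err != nil {
		panic(err)
	}

	// The base URL is an assumption; point this at your running ingestor server.
	resp, err := http.Post("http://localhost:3000/add-log", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out map[string]string
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out["status"]) // expect "success"
}
```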
Endpoint: `GET /logs-count`

Description: Retrieves the count of logs stored in Elasticsearch.

Response Example:

    { "count": 5286 }
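
A matching Go sketch for the count endpoint, under the same assumed base URL as above:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// The base URL is an assumption; point this at your running ingestor server.
	resp, err := http.Get("http://localhost:3000/logs-count")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out struct {
		Count int `json:"count"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println("logs stored:", out.Count)
}
```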
Endpoint: `POST /all-logs`

Description: Searches for logs by log level, topic, and time range.

Request Example:

    {
      "logLevel": "info",
      "topic": "auth",
      "startTime": "2024-11-19T00:00:00Z",
      "endTime": "2024-11-19T23:59:59Z"
    }

Response Example:

    {
      "logs": [
        {
          "_id": "iIG-HpIBzyeB8mG4657K",
          "_index": "ingestor",
          "_score": 2.352951,
          "_source": {
            "commit": "7a91bc3",
            "level": "info",
            "message": "User authentication successful",
            "metadata": {
              "authType": "basic",
              "parentResourceId": "server-1234",
              "username": "john_doe"
            },
            "resourceId": "user-5678",
            "spanId": "span-789",
            "timestamp": "2023-09-15T08:15:00Z",
            "topic": "auth",
            "traceId": "def-uvw-456"
          }
        }
      ]
    }
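
And a Go sketch of the search call, showing how the filter fields map onto the request body and how the Elasticsearch-style hits can be decoded. Again, the base URL is an assumption:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// searchRequest mirrors the filter fields accepted by POST /all-logs.
type searchRequest struct {
	LogLevel  string `json:"logLevel"`
	Topic     string `json:"topic"`
	StartTime string `json:"startTime"`
	EndTime   string `json:"endTime"`
}

func main() {
	req := searchRequest{
		LogLevel:  "info",
		Topic:     "auth",
		StartTime: "2024-11-19T00:00:00Z",
		EndTime:   "2024-11-19T23:59:59Z",
	}
	body, err := json.Marshal(req)
	if err != nil {
		panic(err)
	}

	// The base URL is an assumption; point this at your running ingestor server.
	resp, err := http.Post("http://localhost:3000/all-logs", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode only the fields we care about from each Elasticsearch hit.
	var out struct {
		Logs []struct {
			Source struct {
				Level     string `json:"level"`
				Message   string `json:"message"`
				Timestamp string `json:"timestamp"`
			} `json:"_source"`
		} `json:"logs"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	for _, hit := range out.Logs {
		fmt.Printf("%s [%s] %s\n", hit.Source.Timestamp, hit.Source.Level, hit.Source.Message)
	}
}
```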