Ingestor

A log ingestor and query interface with real-time ingestion, filtering, and a web UI.

Features

  • Filters on specific fields
  • Search within a given time range
  • HTTP endpoint for posting new logs
  • Kafka queue for streamlined log processing
  • Ingestion buffer and batch processing
  • Efficient search queries leveraging Elasticsearch
  • Export logs as JSON
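As an illustrative sketch (not project code), filtering log records by a specific field and a time range looks like this in Python; the records here are toy data shaped like the log entries in the API documentation:

```python
from datetime import datetime, timezone

# Toy log records for illustration only.
logs = [
    {"level": "error", "topic": "auth", "timestamp": "2023-09-15T08:00:00Z"},
    {"level": "info", "topic": "auth", "timestamp": "2023-09-15T09:30:00Z"},
]

def in_range(log, start, end):
    # Parse the RFC 3339 timestamp ("Z" suffix) into an aware datetime.
    ts = datetime.fromisoformat(log["timestamp"].replace("Z", "+00:00"))
    return start <= ts <= end

start = datetime(2023, 9, 15, 8, 0, tzinfo=timezone.utc)
end = datetime(2023, 9, 15, 9, 0, tzinfo=timezone.utc)

# Combine a field filter (level == "error") with the time-range filter.
matches = [l for l in logs if l["level"] == "error" and in_range(l, start, end)]
```

In the project itself this filtering happens server-side in Elasticsearch; the sketch just shows the semantics of combining a field filter with a time window.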

Project Architecture & UI

System Design

[System design diagram]

Frontend

[Frontend screenshot]

Built With

  • Python
  • TypeScript
  • Go
  • Kafka
  • Elasticsearch
  • Next.js
  • Docker
  • Kibana

Getting Started

Installation & Usage

  • Clone the repo

        git clone https://github.com/siAyush/ingestor.git

  • Run the Next.js app

    1. Go to the web directory

           cd web

    2. Install Node dependencies and start the dev server

           pnpm i
           pnpm run dev

  • Run the ingestor server

    1. Go to the server directory

           cd server

    2. Install Go dependencies

           go mod download

    3. Install Python dependencies

           pip install -r requirements.txt

    4. Start producing logs

           cd server/logProducers
           ./runProducers.sh

    5. Start the ingestor server

           cd server/cmd
           go run .

API Documentation

Ingestion Routes

1. New Log Ingestion

  • Endpoint: POST /add-log

  • Description: Ingests a new log entry into the system.

    Request Example:

    {
      "level": "error",
      "message": "Failed to connect to DB",
      "resourceId": "server-1234",
      "timestamp": "2023-09-15T08:00:00Z",
      "traceId": "abc-xyz-123",
      "spanId": "span-456",
      "commit": "5e5342f",
      "topic": "auth",
      "metadata": {
        "parentResourceId": "server-0987"
      }
    }

    Response Example:

    {
      "status": "success"
    }
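The request above can be sent with a short Python sketch using only the standard library. The base URL is an assumption (the docs don't state the server's host or port), so adjust `BASE_URL` to wherever the ingestor server is listening:

```python
import json
import urllib.request

# Assumed base URL -- the docs don't specify the server's port.
BASE_URL = "http://localhost:8080"

# Log entry matching the request example above.
payload = {
    "level": "error",
    "message": "Failed to connect to DB",
    "resourceId": "server-1234",
    "timestamp": "2023-09-15T08:00:00Z",
    "traceId": "abc-xyz-123",
    "spanId": "span-456",
    "commit": "5e5342f",
    "topic": "auth",
    "metadata": {"parentResourceId": "server-0987"},
}

def add_log(entry):
    """POST a log entry to /add-log and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}/add-log",
        data=json.dumps(entry).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# With the server running: add_log(payload) -- per the docs, a successful
# ingestion returns {"status": "success"}.
```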

2. Count Logs

  • Endpoint: GET /logs-count

  • Description: Retrieves the count of logs stored in Elasticsearch.

    Response Example:

    {
      "count": 5286
    }

3. Search Logs

  • Endpoint: POST /all-logs

  • Description: Searches for logs filtered by log level, topic, and a time range.

    Request Example:

    {
      "logLevel": "info",
      "topic": "auth",
      "startTime": "2024-11-19T00:00:00Z",
      "endTime": "2024-11-19T23:59:59Z"
    }

    Response Example:

    {
      "logs": [
        {
          "_id": "iIG-HpIBzyeB8mG4657K",
          "_index": "ingestor",
          "_score": 2.352951,
          "_source": {
            "commit": "7a91bc3",
            "level": "info",
            "message": "User authentication successful",
            "metadata": {
              "authType": "basic",
              "parentResourceId": "server-1234",
              "username": "john_doe"
            },
            "resourceId": "user-5678",
            "spanId": "span-789",
            "timestamp": "2023-09-15T08:15:00Z",
            "topic": "auth",
            "traceId": "def-uvw-456"
          }
        }
      ]
    }
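A small Python sketch of working with this endpoint's data, assuming the response is shaped exactly like the example above (the `_source` nesting follows Elasticsearch's hit format):

```python
# Search request body mirroring the documented example.
search_body = {
    "logLevel": "info",
    "topic": "auth",
    "startTime": "2024-11-19T00:00:00Z",
    "endTime": "2024-11-19T23:59:59Z",
}

# Response shaped like the documented example (trimmed for brevity).
response = {
    "logs": [
        {
            "_id": "iIG-HpIBzyeB8mG4657K",
            "_source": {
                "level": "info",
                "message": "User authentication successful",
                "topic": "auth",
            },
        }
    ]
}

# Each hit carries the original log entry under Elasticsearch's _source key.
messages = [hit["_source"]["message"] for hit in response["logs"]]
```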
