Ecoport is an eco-friendly port transformation platform that combines data analytics, AI, and IoT systems.
1. Ecofleet Dashboard
Demo video: Ecoport Dashboard walkthrough (screen recording).
A real-time, AI-driven monitoring solution designed to optimize energy consumption and reduce carbon emissions in port operations. By integrating IoT sensors, machine learning models, and an interactive dashboard, the system provides actionable insights into energy usage and recommends strategies that reduce environmental impact while maintaining operational efficiency.
2. Smart Energy Monitoring System
Demo video: Ecoport monitoring dashboards (screen recording).
A smart energy management system that uses AI to forecast energy consumption and optimize energy usage in real time. It has three key features: temperature & humidity monitoring, fuel monitoring, and fleet tracking.
- Python 3.9+
- Node.js 14+
- Docker (for local containerization)
- PostgreSQL (for database setup)
- Kubernetes Engine API
- Google Container Registry API
- Google Cloud SDK
- kubectl
```bash
git clone https://github.com/Irisss142/Binjai-Kingdom.git
cd Binjai-Kingdom
```
Create a `.env` file in the root directory to store your environment variables:
```
DATABASE_URL=your_database_url
SECRET_KEY=your_secret_key
```
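If you want to see how these variables typically reach the application, the sketch below shows one common pattern using the `python-dotenv` package. Treat the package and the `config.py` module name as assumptions for illustration; they are not confirmed by this repository.

```python
# config.py -- illustrative sketch only; assumes python-dotenv is installed.
import os
from dotenv import load_dotenv

# Read key=value pairs from the .env file in the project root into os.environ.
load_dotenv()

DATABASE_URL = os.environ["DATABASE_URL"]  # e.g. postgresql://user:pass@host:5432/ecoport
SECRET_KEY = os.environ["SECRET_KEY"]
```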
Navigate to the `/backend` directory and install the required dependencies:
```bash
cd backend
pip install -r requirements.txt
```
Ensure your database is running and initialize it, then start the API server:
```bash
python database/init_db.py
python api/app.py
```
The API should now be running at `http://localhost:5000`.
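For orientation, a minimal `api/app.py` entry point might look like the sketch below. It assumes FastAPI (listed in the backend tech stack) served by uvicorn on the documented port 5000; the real routes and startup code in the repository will differ.

```python
# Minimal sketch of an api/app.py entry point -- an assumption based on the
# FastAPI tech stack entry, not the repository's actual implementation.
from fastapi import FastAPI
import uvicorn

app = FastAPI(title="Ecoport API")

@app.get("/health")
def health() -> dict:
    """Liveness check to confirm the backend is reachable."""
    return {"status": "ok"}

if __name__ == "__main__":
    # Serve on the documented address http://localhost:5000
    uvicorn.run(app, host="0.0.0.0", port=5000)
```

Once it is up, visiting `http://localhost:5000/health` should return `{"status": "ok"}`.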
Navigate to the `/frontend` directory, install the required dependencies, and start the development server:
```bash
cd frontend
npm install
npm start
```
The frontend should now be running at `http://localhost:3000`.
Ensure your historical energy data is stored in the `/data` folder. Train the AI model with the following command:
```bash
python ai_model/train.py
```
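As a rough illustration of what `ai_model/train.py` could contain, the sketch below fits a small Keras LSTM to a univariate consumption series. The file name `data/energy.csv`, the column `energy_kwh`, and the model architecture are assumptions for illustration, not the repository's actual code.

```python
# Illustrative sketch of ai_model/train.py -- file name, column name, and
# model architecture are assumptions, not the repository's actual code.
import numpy as np
import pandas as pd
import tensorflow as tf

WINDOW = 24  # hours of history used to predict the next hour


def make_windows(series: np.ndarray, window: int = WINDOW):
    """Slice a 1-D series into (window -> next value) training pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i : i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)


def main() -> None:
    df = pd.read_csv("data/energy.csv")        # assumed file in the /data folder
    series = df["energy_kwh"].to_numpy(float)  # assumed column name
    X, y = make_windows(series)

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, 1)),
        tf.keras.layers.LSTM(32),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=20, batch_size=32, validation_split=0.1)
    model.save("ai_model/model.keras")         # reused by the prediction step below


if __name__ == "__main__":
    main()
```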
You can use the trained model to predict future energy usage:
```bash
python ai_model/predict.py
```
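Continuing the same assumptions, `ai_model/predict.py` could load the saved model and roll the forecast forward recursively, as in this sketch:

```python
# Illustrative sketch of ai_model/predict.py -- relies on the same assumed
# data layout and saved model as the training sketch above.
import numpy as np
import pandas as pd
import tensorflow as tf

WINDOW = 24    # must match the window used at training time
HORIZON = 12   # number of future steps to forecast


def main() -> None:
    model = tf.keras.models.load_model("ai_model/model.keras")
    series = pd.read_csv("data/energy.csv")["energy_kwh"].to_numpy(float)

    history = series[-WINDOW:].tolist()
    forecast = []
    for _ in range(HORIZON):
        x = np.array(history[-WINDOW:], dtype=float)[np.newaxis, :, np.newaxis]
        next_val = float(model.predict(x, verbose=0)[0, 0])
        forecast.append(next_val)
        history.append(next_val)  # feed the prediction back in (recursive forecast)

    print(f"Forecast for the next {HORIZON} steps:", np.round(forecast, 2))


if __name__ == "__main__":
    main()
```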
You can run the entire system (backend, frontend, and database) locally using Docker Compose:
```bash
docker-compose up --build
```
For cloud deployment, use Kubernetes or Google Cloud Run. Follow the instructions in the `infrastructure/` folder:
- Kubernetes Deployment: Use the `k8s-deployment.yaml` manifest for Kubernetes.
- Google Cloud Run: Use the `cloud_run_deploy.sh` script for Google Cloud deployment.
- Access the dashboard via `http://localhost:3000`.
- View real-time energy consumption and emissions data.
- Receive AI-driven recommendations for optimizing energy usage.
- Use the AI model to forecast future energy usage and carbon emissions (see the query sketch after this list).
- Visualize predicted trends in the dashboard.
- Access energy-saving recommendations in the dashboard.
- Apply suggestions to reduce energy consumption during peak operational hours.
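If you need forecasts outside the dashboard, a small client can query the backend directly. The `/api/forecast` route and its response shape below are hypothetical, used only to illustrate the pattern; check the backend code for the actual endpoints.

```python
# Hypothetical client query -- the /api/forecast route and response shape are
# illustrative assumptions, not documented endpoints of this backend.
import requests

API_BASE = "http://localhost:5000"


def fetch_forecast(hours: int = 12) -> list:
    """Request an energy-usage forecast for the next `hours` hours."""
    resp = requests.get(f"{API_BASE}/api/forecast", params={"hours": hours}, timeout=10)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    print(fetch_forecast(hours=12))
```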
- Python 3.9+
- FastAPI
- PostgreSQL
- SQLAlchemy
- Redis & Celery
- Docker
- Google Kubernetes Engine (GKE)
- React
- Plotly.js
- npm & Webpack
- Tailwind CSS
- Docker
- TensorFlow/Keras
- Scikit-learn, Pandas, NumPy
- Google Cloud Platform (GCP)
- Kubernetes Secrets