Version [2.5.0]
I-am-PUID-0 committed Jul 22, 2024
1 parent b994b48 commit 4a7b07c
Showing 18 changed files with 502 additions and 253 deletions.
22 changes: 22 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,28 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).


## Version [2.5.0] - 2024-07-22

### Added

- [Issue #59](https://github.com/I-am-PUID-0/pd_zurg/issues/59): Added the PDZURG_LOG_SIZE environment variable to set the maximum size of the log file; the default is 10MB
- [Issue #60](https://github.com/I-am-PUID-0/pd_zurg/issues/60): Added the PD_REPO environment variable to set the plex_debrid repository to pull from; the default is `None`

### Changed

- Refactored to use common functions under utils
- Dockerfile: Updated to use the python:3.11-alpine image
- plex_debrid: Updates for plex_debrid are now enabled with PD_UPDATE when PD_REPO is used

### Notes

- The PDZURG_LOG_SIZE environment variable applies only to the pd_zurg log file, not the Zurg or plex_debrid log files.

- The PD_REPO environment variable sets the plex_debrid repository to pull from. If used, the value must be a comma-separated list of the GitHub username, repository name, and optionally the branch; e.g., PD_REPO=itsToggle,plex_debrid,main. The branch defaults to `main` if not specified.

- PD_UPDATE is only functional when PD_REPO is used
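
The comma-separated `PD_REPO` convention above can be illustrated with a short sketch. This is an assumption about how the value could be parsed, not the project's actual implementation; `parse_pd_repo` is a hypothetical helper name.

```python
def parse_pd_repo(value):
    """Split a PD_REPO value like 'itsToggle,plex_debrid,main' into
    (username, repository, branch); the branch falls back to 'main'."""
    parts = [part.strip() for part in value.split(',')]
    if len(parts) < 2:
        raise ValueError("PD_REPO must contain at least 'username,repository'")
    username, repository = parts[0], parts[1]
    # Branch is optional; default to 'main' when absent or empty
    branch = parts[2] if len(parts) > 2 and parts[2] else 'main'
    return username, repository, branch
```

For example, `parse_pd_repo('itsToggle,plex_debrid')` yields `('itsToggle', 'plex_debrid', 'main')`, matching the documented default-branch behavior.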


## Version [2.4.3] - 2024-07-17

### Fixed
8 changes: 7 additions & 1 deletion Dockerfile
@@ -1,7 +1,13 @@
FROM rclone/rclone:latest
FROM rclone/rclone:latest as rclone-stage

FROM python:3.11-alpine
COPY --from=rclone-stage /usr/local/bin/rclone /usr/local/bin/rclone

WORKDIR /

ADD . / ./
ADD https://raw.githubusercontent.com/debridmediamanager/zurg-testing/main/config.yml /zurg/

ENV \
XDG_CONFIG_HOME=/config \
TERM=xterm
8 changes: 6 additions & 2 deletions README.md
@@ -1,5 +1,5 @@
<div align="center" style="max-width: 100%; height: auto;">
<a href="https://github.com/I-am-PUID-0/DMB">
<a href="https://github.com/I-am-PUID-0/pd_zurg">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://github.com/I-am-PUID-0/pd_zurg/assets/36779668/da811d50-18bf-4498-b508-2b1a6ed848bc">
<img alt="pd_zurg" src="https://github.com/I-am-PUID-0/pd_zurg/assets/36779668/da811d50-18bf-4498-b508-2b1a6ed848bc" style="max-width: 100%; height: auto;">
@@ -97,6 +97,7 @@ services:
# - JF_API_KEY
## Plex Debrid Optional Settings
      # - PD_UPDATE=true # deprecated; plex_debrid is no longer maintained
# - PD_REPO=itsToggle,plex_debrid,main
# - SHOW_MENU=false
# - SEERR_API_KEY=
# - SEERR_ADDRESS=
@@ -106,6 +107,7 @@ services:
# - CLEANUP_INTERVAL=1
# - PDZURG_LOG_LEVEL=DEBUG
# - PDZURG_LOG_COUNT=2
# - PDZURG_LOG_SIZE=10M
# Example to attach to gluetun vpn container if realdebrid blocks IP address
# network_mode: container:gluetun
devices:
@@ -179,12 +181,14 @@ of this parameter has the format `<VARIABLE_NAME>=<VALUE>`.
|`SHOW_MENU`| Enable the plex_debrid menu to show upon startup, requiring user interaction before the program runs. Conversely, if the plex_debrid menu is disabled, the program will automatically run upon successful startup. If used, the value must be ```true``` or ```false``` | `true` |
|`PD_ENABLED`| Set the value "true" to enable the plex_debrid process | `false ` | | :heavy_check_mark: | |
|`PD_LOGFILE`| Log file for plex_debrid. The log file will appear in the ```/config``` as ```plex_debrid.log```. If used, the value must be ```true``` or ```false``` | `false` |
|~~`PD_UPDATE`~~| ~~Enable automatic updates of plex_debrid. Adding this variable will enable automatic updates to the latest version of plex_debrid locally within the container.~~ deprecated; plex_debrid is no longer maintained| `false` |
|`PD_UPDATE`| Enable automatic updates of plex_debrid. Adding this variable will enable automatic updates to the latest version of plex_debrid locally within the container. Only functional when PD_REPO is set| `false` |
|`PD_REPO`| The repository to use for plex_debrid. If used, the value must be a comma-separated list of the GitHub username, repository name, and optionally the branch; e.g., "itsToggle,plex_debrid,main" | `None` |
|`AUTO_UPDATE_INTERVAL`| Interval between automatic update checks in hours. Values can be any positive [whole](https://www.oxfordlearnersdictionaries.com/us/definition/english/whole-number) or [decimal](https://www.oxfordreference.com/display/10.1093/oi/authority.20110803095705740;jsessionid=3FDC96CC0D79CCE69702661D025B9E9B#:~:text=The%20separator%20used%20between%20the,number%20expressed%20in%20decimal%20representation.) number. Ex. a value of .5 would yield thirty minutes, and 1.5 would yield one and a half hours | `24` |
|`DUPLICATE_CLEANUP`| Automated cleanup of duplicate content in Plex. | `false` |
|`CLEANUP_INTERVAL`| Interval between duplicate cleanup in hours. Values can be any positive [whole](https://www.oxfordlearnersdictionaries.com/us/definition/english/whole-number) or [decimal](https://www.oxfordreference.com/display/10.1093/oi/authority.20110803095705740;jsessionid=3FDC96CC0D79CCE69702661D025B9E9B#:~:text=The%20separator%20used%20between%20the,number%20expressed%20in%20decimal%20representation.) point based number. Ex. a value of .5 would yield thirty minutes and 1.5 would yield one and a half hours | `24` |
|`PDZURG_LOG_LEVEL`| The level at which logs should be captured. See the python [Logging Levels](https://docs.python.org/3/library/logging.html#logging-levels) documentation for more details | `INFO` |
|`PDZURG_LOG_COUNT`| The number of logs to retain. Result will be value + current log | `2` |
|`PDZURG_LOG_SIZE`| The maximum size of the log file before it is rotated. Valid size suffixes are 'K' (kilobytes), 'M' (megabytes), and 'G' (gigabytes) | `10M` |
|`ZURG_ENABLED`| Set the value "true" to enable the Zurg process | `false ` | | | :heavy_check_mark:|
|`ZURG_VERSION`| The version of Zurg to use. If enabled, the value should contain v0.9.x or v0.9.x-hotfix.x format | `latest` | | | |
|`ZURG_UPDATE`| Enable automatic updates of Zurg. Adding this variable will enable automatic updates to the latest version of Zurg locally within the container. | `false` | | | |
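
The size values accepted by `PDZURG_LOG_SIZE` follow the suffix convention described in the table above. A minimal sketch of converting such a value to bytes (the helper name and the binary-unit interpretation are assumptions, not the project's actual code):

```python
def parse_log_size(value, default=10 * 1024 ** 2):
    """Convert a size string such as '10M' into bytes.
    Supported suffixes: K, M, G; a bare number is treated as bytes."""
    units = {'K': 1024, 'M': 1024 ** 2, 'G': 1024 ** 3}
    if not value:
        return default  # mirrors the documented 10M default
    value = value.strip().upper()
    if value[-1] in units:
        return int(value[:-1]) * units[value[-1]]
    return int(value)
```

With this reading, `PDZURG_LOG_SIZE=500K` would rotate at 500 × 1024 bytes, and an unset variable falls back to the 10M default.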
214 changes: 3 additions & 211 deletions base/__init__.py
@@ -2,7 +2,7 @@
from dotenv import load_dotenv, find_dotenv
from datetime import datetime, timedelta
import logging
from logging.handlers import TimedRotatingFileHandler
from logging.handlers import RotatingFileHandler, TimedRotatingFileHandler, BaseRotatingHandler
from packaging.version import Version, parse as parse_version
import time
import os
@@ -26,215 +26,6 @@

load_dotenv(find_dotenv('./config/.env'))

class SubprocessLogger:
    def __init__(self, logger, key_type):
        self.logger = logger
        self.key_type = key_type
        self.log_methods = {
            'DEBUG': logger.debug,
            'INFO': logger.info,
            'NOTICE': logger.debug,
            'WARNING': logger.warning,
            'ERROR': logger.error,
            'UNKNOWN': logger.info
        }

    @staticmethod
    def parse_log_level_and_message(line, process_name):
        log_levels = {'DEBUG', 'INFO', 'NOTICE', 'WARNING', 'ERROR'}
        log_level = None
        message = None
        log_level_pattern = re.compile(r'({})\s*(.*)'.format('|'.join(log_levels)))
        match = log_level_pattern.search(line)

        if match:
            log_level = match.group(1)
            message = match.group(2).strip()
            if process_name == 'rclone' and message.startswith(': '):
                message = message[2:]
        else:
            log_level = 'UNKNOWN'
            message = line
        return log_level, message

    def monitor_stderr(self, process, mount_name, process_name):
        for line in process.stderr:
            if isinstance(line, bytes):
                line = line.decode().strip()
            else:
                line = line.strip()
            if line:
                log_level, message = SubprocessLogger.parse_log_level_and_message(line, process_name)
                log_func = self.log_methods.get(log_level, self.logger.info)
                if process_name == 'rclone':
                    log_func(f"rclone mount name \"{mount_name}\": {message}")
                else:
                    log_func(f"{process_name}: {message}")

    def start_monitoring_stderr(self, process, mount_name, process_name):
        threading.Thread(target=self.monitor_stderr, args=(process, mount_name, process_name)).start()

    def log_subprocess_output(self, pipe):
        try:
            for line in iter(pipe.readline, ''):
                if isinstance(line, bytes):
                    line = line.decode().strip()
                else:
                    line = line.strip()
                if line:
                    log_level, message = SubprocessLogger.parse_log_level_and_message(line, self.key_type)
                    log_func = self.log_methods.get(log_level, self.logger.info)
                    log_func(f"{self.key_type} subprocess: {message}")
        except ValueError as e:
            self.logger.error(f"Error reading subprocess output for {self.key_type}: {e}")

    def start_logging_stdout(self, process):
        log_thread = threading.Thread(target=self.log_subprocess_output, args=(process.stdout,))
        log_thread.daemon = True
        log_thread.start()

class MissingAPIKeyException(Exception):
    def __init__(self):
        self.message = "Please set the debrid API Key: environment variable is missing from the docker-compose file"
        super().__init__(self.message)

class MissingEnvironmentVariable(Exception):
    def __init__(self, variable_name):
        self.variable_name = variable_name
        message = f"Environment variable '{variable_name}' is missing."
        super().__init__(message)

    def log_exception(self, logger):
        logger.error(f"Missing environment variable: {self.variable_name}")

class ConfigurationError(Exception):
    def __init__(self, error_message):
        self.error_message = error_message
        super().__init__(self.error_message)

def format_time(interval):
    interval_hours = int(interval)
    interval_minutes = int((interval - interval_hours) * 60)

    if interval_hours == 1 and interval_minutes == 0:
        return "1 hour"
    elif interval_hours == 1 and interval_minutes != 0:
        return f"1 hour {interval_minutes} minutes"
    elif interval_hours != 1 and interval_minutes == 0:
        return f"{interval_hours} hours"
    else:
        return f"{interval_hours} hours {interval_minutes} minutes"

def get_start_time():
    start_time = time.time()
    return start_time

def time_to_complete(start_time):
    end_time = time.time()
    elapsed_time = end_time - start_time

    hours = int(elapsed_time // 3600)
    minutes = int((elapsed_time % 3600) // 60)
    seconds = int(elapsed_time % 60)

    time_string = ""
    if hours > 0:
        time_string += f"{hours} hour(s) "
    if minutes > 0:
        time_string += f"{minutes} minute(s) "
    if seconds > 0:
        time_string += f"{seconds} second(s)"
    return time_string

class CustomTimedRotatingFileHandler(TimedRotatingFileHandler):
    def __init__(self, filename, when='h', interval=1, backupCount=0, encoding=None, delay=False, utc=False, atTime=None):
        self.rollover_filename = filename
        TimedRotatingFileHandler.__init__(self, self.rollover_filename, when, interval, backupCount, encoding, delay, utc, atTime)

    def doRollover(self):
        if self.stream:
            self.stream.close()
            self.stream = None

        base_file_name_without_date = self.baseFilename.rsplit('-', 3)[0]
        current_date = time.strftime("%Y-%m-%d")
        correct_filename = base_file_name_without_date + '-' + current_date + '.log'

        if self.rollover_filename != correct_filename:
            new_filename = correct_filename
        else:
            new_filename = self.rollover_filename

        filenames_to_delete = self.getFilesToDelete()
        for filename in filenames_to_delete:
            os.remove(filename)

        self.rollover_filename = new_filename
        self.baseFilename = self.rollover_filename
        self.stream = self._open()

        new_rollover_at = self.computeRollover(self.rolloverAt)
        while new_rollover_at <= time.time():
            new_rollover_at = new_rollover_at + self.interval
        if self.utc:
            dst_at_rollover = time.localtime(new_rollover_at)[-1]
        else:
            dst_at_rollover = time.gmtime(new_rollover_at)[-1]

        if time.localtime(time.time())[-1] != dst_at_rollover:
            addend = -3600 if time.localtime(time.time())[-1] else 3600
            new_rollover_at += addend
        self.rolloverAt = new_rollover_at

    def getFilesToDelete(self):
        dirName, baseName = os.path.split(self.baseFilename)
        fileNames = os.listdir(dirName)
        result = []
        prefix = baseName.split('-', 1)[0] + "-"
        plen = len(prefix)
        for fileName in fileNames:
            if fileName[:plen] == prefix:
                suffix = fileName[plen:]
                if re.compile(r"^\d{4}-\d{2}-\d{2}.log$").match(suffix):
                    result.append(os.path.join(dirName, fileName))
        result.sort()
        if len(result) < self.backupCount:
            result = []
        else:
            result = result[:len(result) - self.backupCount]
        return result

def get_logger(log_name='PDZURG', log_dir='./log'):
    current_date = time.strftime("%Y-%m-%d")
    log_filename = f"{log_name}-{current_date}.log"
    logger = logging.getLogger(log_name)
    backupCount_env = os.getenv('PDZURG_LOG_COUNT')
    try:
        backupCount = int(backupCount_env)
    except (ValueError, TypeError):
        backupCount = 2
    log_level_env = os.getenv('PDZURG_LOG_LEVEL')
    if log_level_env:
        log_level = log_level_env.upper()
        os.environ['LOG_LEVEL'] = log_level
        os.environ['RCLONE_LOG_LEVEL'] = log_level
    else:
        log_level = 'INFO'
    numeric_level = getattr(logging, log_level, logging.INFO)
    logger.setLevel(numeric_level)
    log_path = os.path.join(log_dir, log_filename)
    handler = CustomTimedRotatingFileHandler(log_path, when="midnight", interval=1, backupCount=backupCount)
    os.chmod(log_path, 0o666)
    formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s', datefmt='%b %e, %Y %H:%M:%S')
    handler.setFormatter(formatter)
    stdout_handler = logging.StreamHandler(sys.stdout)
    stdout_handler.setFormatter(formatter)

    for hdlr in logger.handlers[:]:
        logger.removeHandler(hdlr)
    logger.addHandler(handler)
    logger.addHandler(stdout_handler)
    return logger


def load_secret_or_env(secret_name, default=None):
secret_file = f'/run/secrets/{secret_name}'
@@ -260,6 +51,7 @@ def load_secret_or_env(secret_name, default=None):
SHOWMENU = os.getenv('SHOW_MENU')
LOGFILE = os.getenv('PD_LOGFILE')
PDUPDATE = os.getenv('PD_UPDATE')
PDREPO = os.getenv('PD_REPO')
DUPECLEAN = os.getenv('DUPLICATE_CLEANUP')
CLEANUPINT = os.getenv('CLEANUP_INTERVAL')
RCLONEMN = os.getenv("RCLONE_MOUNT_NAME")
@@ -271,4 +63,4 @@ def load_secret_or_env(secret_name, default=None):
PLEXMOUNT = os.getenv('PLEX_MOUNT_DIR')
NFSMOUNT = os.getenv('NFS_ENABLED')
NFSPORT = os.getenv('NFS_PORT')
ZURGPORT = os.getenv('ZURG_PORT')
ZURGPORT = os.getenv('ZURG_PORT')
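
The newly imported `RotatingFileHandler` rotates logs by size rather than by time, which is what `PDZURG_LOG_SIZE` implies. A hedged sketch of wiring the variable to such a handler (`build_size_rotating_handler` is a hypothetical name; the real wiring lives in the refactored utils):

```python
import logging
import os
from logging.handlers import RotatingFileHandler

def build_size_rotating_handler(log_path, backup_count=2):
    # Assumed convention: PDZURG_LOG_SIZE caps the file size before rollover,
    # using K/M/G suffixes and defaulting to 10M as documented.
    size = os.getenv('PDZURG_LOG_SIZE', '10M').strip().upper()
    units = {'K': 1024, 'M': 1024 ** 2, 'G': 1024 ** 3}
    max_bytes = int(size[:-1]) * units[size[-1]] if size[-1] in units else int(size)
    handler = RotatingFileHandler(log_path, maxBytes=max_bytes, backupCount=backup_count)
    handler.setFormatter(logging.Formatter('%(asctime)s - %(levelname)s - %(message)s'))
    return handler
```

Size-based rotation keeps disk usage bounded regardless of how chatty the subprocesses are, whereas the previous midnight-based `CustomTimedRotatingFileHandler` rotated once per day no matter how large the file grew.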
1 change: 1 addition & 0 deletions cleanup/duplicate_cleanup.py
@@ -1,4 +1,5 @@
from base import *
from utils.logger import *
from plexapi.server import PlexServer
from plexapi import exceptions as plexapi_exceptions
from requests.exceptions import HTTPError
2 changes: 2 additions & 0 deletions healthcheck.py
@@ -1,4 +1,6 @@
from base import *
from utils.logger import *


def check_processes(process_info):
    found_processes = {key: False for key in process_info.keys()}
14 changes: 9 additions & 5 deletions main.py
@@ -1,4 +1,5 @@
from base import *
from utils.logger import *
import plex_debrid_ as p
import zurg as z
from rclone import rclone
@@ -9,7 +10,7 @@
def main():
    logger = get_logger()

    version = '2.4.3'
    version = '2.5.0'

    ascii_art = f'''
@@ -82,10 +83,13 @@ def healthcheck():
try:
    p.setup.pd_setup()
    pd_updater = p.update.PlexDebridUpdate()
    #if PDUPDATE:
        #pd_updater.auto_update('plex_debrid',True)
    #else:
    pd_updater.auto_update('plex_debrid',False)
    if PDUPDATE:
        pd_updater.auto_update('plex_debrid',True)
    elif PDREPO:
        p.download.get_latest_release()
        pd_updater.auto_update('plex_debrid',False)
    else:
        pd_updater.auto_update('plex_debrid',False)
except Exception as e:
    logger.error(f"An error occurred in the plex_debrid setup: {e}")
except: