Migrate streaming providers functions to async & improve performance #331
Conversation
> ⚠️ **Rate limit exceeded.** @mhdzumair has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 8 minutes and 3 seconds before requesting another review. CodeRabbit enforces hourly rate limits for each developer per organization; further reviews are re-allowed after a brief timeout.

**Walkthrough**

This pull request introduces significant modifications across multiple files in the streaming providers module, primarily transitioning various classes and functions from synchronous to asynchronous implementations. Key changes include the conversion of numerous methods to asynchronous equivalents.

**Changes**
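The performance gain from a migration like this comes from overlapping I/O waits instead of serializing them. A minimal, self-contained sketch of the pattern (function names hypothetical; `asyncio.sleep` stands in for HTTP round trips to the providers):

```python
import asyncio
import time


async def check_provider_cache(info_hash: str) -> tuple[str, bool]:
    # Stand-in for one HTTP round trip to a debrid provider.
    await asyncio.sleep(0.1)
    return info_hash, True


async def update_cache_status(info_hashes: list[str]) -> dict[str, bool]:
    # asyncio.gather overlaps the waits: three lookups take roughly 0.1s
    # of wall time instead of roughly 0.3s when run one after another.
    results = await asyncio.gather(
        *(check_provider_cache(h) for h in info_hashes)
    )
    return dict(results)


start = time.perf_counter()
statuses = asyncio.run(update_cache_status(["hash1", "hash2", "hash3"]))
elapsed = time.perf_counter() - start
print(statuses)
```

The same structure applies whether the awaited calls are `httpx` requests or any other awaitable; the event loop interleaves them while each is blocked on the network.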
Actionable comments posted: 41
🧹 Outside diff range and nitpick comments (35)
streaming_providers/debridlink/api.py (1)
**21-25: Async implementation looks great, consider adding error handling**

The transition to using an async context manager for the DebridLink client in the `authorize` function is well implemented. This change:

- Ensures proper resource management
- Allows for non-blocking I/O operations, potentially improving performance
- Maintains backwards compatibility by keeping the function signature unchanged

The implementation correctly awaits the result of `dl_client.authorize(data.device_code)` and uses the constant `NO_CACHE_HEADERS`, maintaining consistency with the previous version.

Consider adding error handling to manage potential exceptions that might occur during the authorization process. This could improve the robustness of the function. For example:

```python
@router.post("/authorize")
async def authorize(data: AuthorizeData):
    try:
        async with DebridLink() as dl_client:
            result = await dl_client.authorize(data.device_code)
            return JSONResponse(
                content=result,
                headers=const.NO_CACHE_HEADERS,
            )
    except Exception as e:
        # Log the error
        logger.error(f"Authorization failed: {str(e)}")
        return JSONResponse(
            content={"error": "Authorization failed"},
            status_code=500,
            headers=const.NO_CACHE_HEADERS,
        )
```

This suggestion adds basic error handling and provides a more informative response in case of failures.
streaming_providers/premiumize/api.py (1)
**24-35: Consider specifying the HTTP status code for error responses**

When returning error messages, specifying the appropriate HTTP status code can help clients handle responses correctly. For example, use a 400 status code for bad requests or 500 for internal server errors.

Here's how you might adjust the error response:

```python
from fastapi import HTTPException, status

# In the exception handling block
except Exception as e:
    raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=str(e))
```

streaming_providers/alldebrid/utils.py (3)
**10-20: Consider adding type hints for better code clarity**

Adding type hints to the `get_torrent_info` function will improve code readability and help with static analysis tools.

Apply this diff to include type hints:

```diff
-async def get_torrent_info(ad_client, info_hash):
+async def get_torrent_info(ad_client: AllDebrid, info_hash: str) -> Optional[Dict]:
     torrent_info = await ad_client.get_available_torrent(info_hash)
     if torrent_info and torrent_info["status"] == "Ready":
         return torrent_info
```

Ensure to import `Optional` and `Dict` from the `typing` module if not already imported:

```python
from typing import Optional, Dict
```
**23-25: Add type hints and handle potential exceptions**

For improved code clarity and robustness, consider adding type hints to the `add_new_torrent` function. Additionally, ensure that exceptions from `ad_client.add_magnet_link` are appropriately handled.

Apply this diff to include type hints:

```diff
-async def add_new_torrent(ad_client, magnet_link):
+async def add_new_torrent(ad_client: AllDebrid, magnet_link: str) -> str:
     response_data = await ad_client.add_magnet_link(magnet_link)
     return response_data["data"]["magnets"][0]["id"]
```
**Line range hint 28-45: Add type hints to `wait_for_download_and_get_link` function**

Adding type hints will enhance readability and assist with static analysis.

Apply this diff to include type hints:

```diff
 async def wait_for_download_and_get_link(
-    ad_client, torrent_id, filename, file_index, episode, max_retries, retry_interval
+    ad_client: AllDebrid,
+    torrent_id: int,
+    filename: str,
+    file_index: int,
+    episode: Optional[int],
+    max_retries: int,
+    retry_interval: int,
 ) -> str:
     torrent_info = await ad_client.wait_for_status(
```

Ensure to import `Optional` from the `typing` module if not already imported:

```python
from typing import Optional
```

streaming_providers/alldebrid/client.py (5)
**15-16: Consider removing `async` from `initialize_headers`**

The method `initialize_headers` is marked as `async` but does not contain any `await` expressions. Unless there is a specific need for it to be asynchronous, consider removing the `async` keyword to simplify the code.

**18-19: Consider removing `async` from `disable_access_token`**

The method `disable_access_token` is defined as asynchronous but doesn't perform any asynchronous operations. If there's no intention to add asynchronous code here in the future, you might remove the `async` keyword.

**21-22: Consider removing `async` from `_handle_service_specific_errors`**

The method `_handle_service_specific_errors` is declared as `async` but doesn't contain any `await` statements. If asynchronous operations aren't required, consider making it a regular method.
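The needless `async` is not only stylistic: invoking an `async def` returns a coroutine object that must be awaited, so every caller pays event-loop machinery for a method that does no I/O. A standalone sketch (class and method names hypothetical) contrasting the two forms:

```python
import asyncio


class SyncHeaders:
    def initialize_headers(self) -> dict:
        # Plain method: the headers are available immediately.
        return {"Accept": "application/json"}


class AsyncHeaders:
    async def initialize_headers(self) -> dict:
        # Same body, but every caller now needs an event loop and an
        # `await` just to get a plain dict back.
        return {"Accept": "application/json"}


sync_result = SyncHeaders().initialize_headers()
coro = AsyncHeaders().initialize_headers()
print(asyncio.iscoroutine(coro))  # True: the caller got a coroutine, not headers
async_result = asyncio.run(coro)
print(sync_result == async_result)  # True: same data, extra machinery
```

The exception is when the method overrides an `async` method of a base class; then the signature has to stay asynchronous even if the body does nothing awaitable.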
**Line range hint 50-68: Ensure consistent error message formatting**

In the `_validate_error_response` method, some error messages have been simplified by removing unnecessary f-string formatting. However, other parts of the code still use f-strings for static messages. For consistency and clarity, consider standardizing how error messages are formatted throughout the codebase.
**Line range hint 107-118: Enhance exception handling in `create_download_link`**

The exception raised in `create_download_link` includes the full response in the message, which may expose sensitive information. Consider logging the detailed response for debugging purposes and raising a user-friendly exception message instead.

Apply this change to improve exception handling:

```diff
 async def create_download_link(self, link):
     response = await self._make_request(
         "GET",
         "/link/unlock",
         params={"link": link},
         is_expected_to_fail=True,
     )
     if response.get("status") == "success":
         return response
-    raise ProviderException(
-        f"Failed to create download link from AllDebrid {response}",
-        "transfer_error.mp4",
-    )
+    # Log the detailed response for debugging
+    logger.error(f"Failed to create download link from AllDebrid: {response}")
+    # Raise a user-friendly exception
+    raise ProviderException(
+        "Failed to create download link from AllDebrid.",
+        "transfer_error.mp4",
+    )
```

streaming_providers/premiumize/client.py (2)
**Line range hint 77-88: Handle potential `KeyError` in `get_torrent_info`**

In `get_torrent_info`, the code assumes that `transfer_list` contains a `"transfers"` key. To prevent a potential `KeyError`, consider adding a check to ensure that `"transfers"` exists in `transfer_list` before accessing it.

Apply the following change to handle the potential error:

```diff
 async def get_torrent_info(self, torrent_id):
     transfer_list = await self.get_transfer_list()
+    if "transfers" not in transfer_list:
+        raise ProviderException("Missing 'transfers' in response", "transfer_error.mp4")
     torrent_info = next(
         (
             torrent
             for torrent in transfer_list["transfers"]
             if torrent["id"] == torrent_id
         ),
         None,
     )
     return torrent_info
```
**Line range hint 120-134: Verify the integrity of response handling in `get_available_torrent`**

In `get_available_torrent`, ensure that the response from `get_transfer_list` contains the `"transfers"` key before accessing it to prevent a potential `KeyError`. Consider adding a check and handling unexpected responses gracefully.

Apply the following change to handle potential missing keys:

```diff
 torrent_list_response = await self.get_transfer_list()
 if torrent_list_response.get("status") != "success":
     if torrent_list_response.get("message") == "Not logged in.":
         raise ProviderException(
             "Premiumize is not logged in.", "invalid_token.mp4"
         )
     raise ProviderException(
         "Failed to get torrent info from Premiumize", "transfer_error.mp4"
     )
+if "transfers" not in torrent_list_response:
+    raise ProviderException(
+        "Missing 'transfers' in response", "transfer_error.mp4"
+    )
 available_torrents = torrent_list_response["transfers"]
```

streaming_providers/realdebrid/utils.py (2)
**59-60: Correct grammatical error in error message**

There's a grammatical error in the error message: "Torrent is already being downloading". It should be "Torrent is already downloading" or "Torrent is already being downloaded".

Apply this diff to fix the error message:

```diff
-    "Torrent is already being downloading", "torrent_not_downloaded.mp4"
+    "Torrent is already downloading", "torrent_not_downloaded.mp4"
```
**117-117: Consider logging exceptions instead of passing silently**

In the `update_rd_cache_status` function, the `except ProviderException:` block silently passes on exceptions. Suppressing exceptions without any logging can make debugging difficult.

Consider logging the exception or providing feedback to help with troubleshooting. For example:

```python
except ProviderException as e:
    logger.error(f"Failed to update RD cache status: {e}")
```

streaming_providers/premiumize/utils.py (1)
**43-44: Improve error message grammar**

The error message "Not enough seeders available for parse magnet link" is grammatically incorrect. Consider rephrasing it for clarity.

Suggested change:

```diff
 raise ProviderException(
-    "Not enough seeders available for parse magnet link",
+    "Not enough seeders available to parse magnet link",
     "transfer_error.mp4",
 )
```

streaming_providers/offcloud/client.py (2)
**17-18: Unnecessary use of `async` in `initialize_headers`**

The `initialize_headers` method does not perform any asynchronous operations. Declaring it as an `async` function is unnecessary and may introduce unintended overhead. Consider removing the `async` keyword.

Apply this diff to fix the issue:

```diff
-    async def initialize_headers(self):
+    def initialize_headers(self):
         pass
```
**20-26: Unnecessary use of `async` in `_handle_service_specific_errors`**

The `_handle_service_specific_errors` method does not perform any asynchronous operations. Declaring it as an `async` function is unnecessary. Consider removing the `async` keyword.

Apply this diff to fix the issue:

```diff
-    async def _handle_service_specific_errors(self, error: httpx.HTTPStatusError):
+    def _handle_service_specific_errors(self, error: httpx.HTTPStatusError):
         if error.response.status_code == 403:
             raise ProviderException("Invalid OffCloud API key", "invalid_token.mp4")
         if error.response.status_code == 429:
             raise ProviderException(
                 "OffCloud rate limit exceeded", "too_many_requests.mp4"
             )
```

streaming_providers/realdebrid/client.py (5)
**Line range hint 20-34: Confirm Python version compatibility for the `match` statement**

The use of the `match` statement in `_handle_service_specific_errors` relies on Python 3.10 or newer. Ensure that the project's runtime environment supports Python 3.10+, or refactor the `match` statement to use `if`-`elif` conditions for broader compatibility.

If needed, refactor using `if`-`elif` statements:

```diff
-        match error_code:
-            case 9:
+        if error_code == 9:
             raise ProviderException(
                 "Real-Debrid Permission denied for free account",
                 "need_premium.mp4",
             )
-            case 22:
+        elif error_code == 22:
             raise ProviderException(
                 "IP address not allowed", "ip_not_allowed.mp4"
             )
-            case 34:
+        elif error_code == 34:
             raise ProviderException(
                 "Too many requests", "too_many_requests.mp4"
             )
-            case 35:
+        elif error_code == 35:
             raise ProviderException(
                 "Content marked as infringing", "content_infringing.mp4"
             )
```
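For a flat code-to-message dispatch like this, a lookup table is a third, version-agnostic option besides `match` and `if`-`elif`. A standalone sketch using the codes and messages from the diff above (the exception class is simplified to a plain `Exception` subclass for illustration):

```python
# Version-agnostic alternative: a lookup table instead of match/if-elif.
class ProviderException(Exception):
    """Simplified stand-in for the project's exception class."""

    def __init__(self, message: str, video_file: str):
        super().__init__(message)
        self.video_file = video_file


ERROR_MAP = {
    9: ("Real-Debrid Permission denied for free account", "need_premium.mp4"),
    22: ("IP address not allowed", "ip_not_allowed.mp4"),
    34: ("Too many requests", "too_many_requests.mp4"),
    35: ("Content marked as infringing", "content_infringing.mp4"),
}


def handle_error_code(error_code: int) -> None:
    if error_code in ERROR_MAP:
        message, video_file = ERROR_MAP[error_code]
        raise ProviderException(message, video_file)
    # Unknown codes fall through silently, matching the original behaviour.


try:
    handle_error_code(34)
except ProviderException as exc:
    print(exc.video_file)  # too_many_requests.mp4
```

The table form also keeps adding a new error code to a one-line change.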
**Line range hint 59-70: Ensure `initialize_headers` is properly awaited**

The `initialize_headers` method has been converted to an asynchronous function. Ensure that any code calling this method awaits it appropriately. If overridden from the base class, verify that the base class method is also asynchronous and that all calls are updated.

Search for all usages of `initialize_headers` and update them to include `await`:

```bash
#!/bin/bash
# Description: Find all calls to initialize_headers that need to be updated.
# Test: Search for initialize_headers calls not preceded by await.
rg --type py --fixed-strings 'initialize_headers(' | rg -v 'await'
```
**163-166: Unnecessary `return` statement in `disable_access_token`**

The method `disable_access_token` performs an action but does not need to return any value. The `return` statement can be omitted for clarity, especially since the method is intended to return `None`.

Consider removing the `return` statement:

```diff
-        return await self._make_request(
+        await self._make_request(
             "GET",
             f"{self.BASE_URL}/disable_access_token",
             is_return_none=True,
             is_expected_to_fail=True,
         )
```
**Line range hint 185-202: Enhance error handling in `create_download_link`**

The method checks for `"download"` and `"error_code"` keys in the `response`, but if neither is present, it raises a generic exception. Consider adding more detailed error handling or logging to aid in debugging unexpected responses.

You might want to log the unexpected response or include additional information in the exception:

```diff
 raise ProviderException(
-    f"Failed to create download link. response: {response}", "api_error.mp4"
+    f"Failed to create download link. Unexpected response: {response}", "api_error.mp4"
 )
```
**2-2: Remove unused import `Any`**

The `Any` type is imported but not used in the code. Cleaning up unused imports can improve code readability.

```diff
-from typing import Any, Optional
+from typing import Optional
```

streaming_providers/torbox/client.py (3)
**10-11: Consider removing `async` from `initialize_headers`**

The method `initialize_headers` is declared as asynchronous, but it doesn't contain any `await` expressions or asynchronous operations. If it's not required to be asynchronous (e.g., not overriding an async method in the base class), consider removing the `async` keyword to simplify the code.

**16-17: Consider removing `async` from `_handle_service_specific_errors`**

The method `_handle_service_specific_errors` is declared as asynchronous but does not contain any `await` expressions or perform asynchronous operations. If this method doesn't need to be async (e.g., not overriding an async method in the base class), consider removing the `async` keyword.
**Line range hint 79-90: Enhance error handling in `create_download_link` method**

The condition `if "successfully" in response.get("detail")` relies on substring matching, which may not be robust against different response messages. Consider checking for a specific success status or code in the response to make error handling more reliable.

Apply this diff to improve the error checking:

```diff
-    if "successfully" in response.get("detail"):
+    if response.get("status") == "success":
```

Ensure that the API indeed returns a `status` field with a value indicating success.

streaming_providers/debridlink/utils.py (3)
**12-12: Consider using union types if supporting Python 3.10 or higher**

The parameter `episode` has been annotated with `Optional[int]`. If your project exclusively supports Python 3.10 or newer, you might use the union type syntax `int | None` for improved readability.

Also applies to: 30-30
**79-80: Log exceptions instead of passing silently**

In the `except ProviderException:` blocks, exceptions are caught but not logged. This can make debugging difficult. Consider logging the exceptions to aid in troubleshooting.

Apply this diff to add logging (assuming a configured logger):

```diff
 # In update_dl_cache_status
 except ProviderException as e:
-    pass
+    logger.error(f"ProviderException in update_dl_cache_status: {e}")

 # In fetch_downloaded_info_hashes_from_dl
 except ProviderException as e:
-    return []
+    logger.error(f"ProviderException in fetch_downloaded_info_hashes_from_dl: {e}")
+    return []
```

Also applies to: 94-95
**1-2: Remove unused imports if they are unnecessary**

Ensure that all imported modules are used in the code. If `asyncio` or `Optional` are not required elsewhere, consider removing them to clean up the imports.

streaming_providers/debridlink/client.py (7)
**Line range hint 13-17: Consider removing `async` from `_handle_service_specific_errors`**

The method `_handle_service_specific_errors` has been converted to an `async` function but does not contain any `await` expressions or perform asynchronous operations. Converting it to `async` unnecessarily may introduce overhead and confusion. Consider keeping it as a synchronous method unless there is a plan to include asynchronous logic.

🧰 Tools
🪛 Gitleaks

11-11: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
**Line range hint 55-62: Add return type annotation to `get_token`**

Adding a return type annotation to `get_token` enhances readability and assists with static type checking. Example:

```python
async def get_token(self, client_id, device_code) -> dict[str, Any]:
```

**Line range hint 67-74: Add return type annotation to `refresh_token`**

For consistency and clarity, consider specifying the return type of `refresh_token`. Example:

```python
async def refresh_token(self, client_id, refresh_token) -> dict[str, Any]:
```

**Line range hint 78-90: Add return type annotation to `authorize`**

The `authorize` method returns different structures based on the response. Specifying the return type can improve understanding of the possible outcomes. Example:

```python
async def authorize(self, device_code) -> dict[str, Any]:
```

**Line range hint 92-104: Add return type annotation to `add_magnet_link`**

Including a return type annotation for `add_magnet_link` improves code readability. Example:

```python
async def add_magnet_link(self, magnet_link) -> dict[str, Any]:
```

**Line range hint 109-117: Add return type annotation to `get_torrent_info`**

Specify the return type of `get_torrent_info` for better clarity. Example:

```python
async def get_torrent_info(self, torrent_id) -> dict[str, Any]:
```
**Missing `await` for asynchronous method calls**

Several calls to updated asynchronous methods are not preceded by `await`, which may lead to `RuntimeWarning` or `RuntimeError`. Please ensure that all calls to the following methods are properly awaited and within an asynchronous context:

- streaming_providers/seedr/api.py
  - `response = seedr.authorize(data.device_code)`
- streaming_providers/realdebrid/api.py
  - `content=rd_client.get_device_code()`
- scrapers/torrent_info.py
  - `torrent_info = torrent_info.get_torrent_info(prowlarr_data["infoUrl"], prowlarr_data["indexer"])`
- streaming_providers/debridlink/utils.py
  - `torrent_info = dl_client.get_available_torrent(info_hash)`
  - `torrent_id = dl_client.add_magnet_link(magnet_link)`

Please update these calls by adding the `await` keyword to ensure proper asynchronous execution.

🔗 Analysis chain
**Line range hint 142-153: Verify that all callers of updated methods are now `async`**

With the conversion of several methods to `async`, ensure that all code calling these methods has been updated to use `await` and is within an asynchronous context. Failure to do so can lead to `RuntimeWarning` or `RuntimeError` due to unawaited coroutines or calling `async` functions synchronously.

Run the following script to find usages of the updated methods and verify that they are awaited. It searches for calls to the updated methods that are not preceded by `await`, indicating potential issues; review any matches to ensure that calls to these methods are correctly awaited.

🏁 Scripts executed

The following scripts were executed for the analysis:

```bash
#!/bin/bash
# Description: Verify all calls to updated async methods are properly awaited.
methods=(
    '_handle_service_specific_errors'
    'initialize_headers'
    'get_device_code'
    'get_token'
    'refresh_token'
    'authorize'
    'add_magnet_link'
    'get_user_torrent_list'
    'get_torrent_info'
    'get_torrent_files_list'
    'get_torrent_instant_availability'
    'delete_torrent'
    'disable_access_token'
    'get_available_torrent'
)

for method in "${methods[@]}"; do
    echo "Searching for calls to ${method}:"
    rg --type py -g '!streaming_providers/debridlink/client.py' --pcre2 "(?<!await\s)(\b$method\b)\s*\(" -A 2 -B 2
    echo ""
done
```

Length of output: 49438
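The failure mode the script hunts for is easy to reproduce: calling an async method without `await` yields a coroutine object and silently skips the work. A standalone sketch (class and method names hypothetical):

```python
import asyncio


class DebridClientSketch:
    async def add_magnet_link(self, magnet_link: str) -> str:
        # Stand-in for an HTTP call that registers the magnet link.
        return f"torrent-id-for-{magnet_link}"


async def main() -> tuple[bool, str]:
    client = DebridClientSketch()

    # Forgotten `await`: the method body never runs; we only get a coroutine.
    pending = client.add_magnet_link("magnet:?xt=abc")
    is_coroutine = asyncio.iscoroutine(pending)
    pending.close()  # avoid the "coroutine was never awaited" RuntimeWarning

    # Correct call: awaiting actually executes the request.
    torrent_id = await client.add_magnet_link("magnet:?xt=abc")
    return is_coroutine, torrent_id


is_coroutine, torrent_id = asyncio.run(main())
print(is_coroutine, torrent_id)
```

At runtime the forgotten form only surfaces as a `RuntimeWarning` when the coroutine is garbage-collected, which is why a static search like the `rg` script above is worth running after a sync-to-async migration.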
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (17)
- streaming_providers/alldebrid/client.py (5 hunks)
- streaming_providers/alldebrid/utils.py (3 hunks)
- streaming_providers/debrid_client.py (1 hunks)
- streaming_providers/debridlink/api.py (1 hunks)
- streaming_providers/debridlink/client.py (11 hunks)
- streaming_providers/debridlink/utils.py (2 hunks)
- streaming_providers/offcloud/client.py (3 hunks)
- streaming_providers/offcloud/utils.py (1 hunks)
- streaming_providers/parser.py (1 hunks)
- streaming_providers/premiumize/api.py (1 hunks)
- streaming_providers/premiumize/client.py (5 hunks)
- streaming_providers/premiumize/utils.py (3 hunks)
- streaming_providers/realdebrid/api.py (1 hunks)
- streaming_providers/realdebrid/client.py (9 hunks)
- streaming_providers/realdebrid/utils.py (3 hunks)
- streaming_providers/torbox/client.py (4 hunks)
- streaming_providers/torbox/utils.py (1 hunks)
🧰 Additional context used
🪛 Ruff
streaming_providers/debrid_client.py
22-25: Use `contextlib.suppress(ProviderException)` instead of `try`-`except`-`pass`. Replace with `contextlib.suppress(ProviderException)`. (SIM105)

97-100: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling. (B904)
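The SIM105 suggestion replaces a `try`/`except`/`pass` with `contextlib.suppress`, which reads as a one-liner and states the intent up front. A small standalone illustration (the exception class is a simplified stand-in):

```python
import contextlib


class ProviderException(Exception):
    """Simplified stand-in for the project's exception class."""


def flaky_cache_lookup() -> None:
    raise ProviderException("cache miss")


# Before: the exception is swallowed by an empty except block.
try:
    flaky_cache_lookup()
except ProviderException:
    pass

# After: identical behaviour, but the suppression is declared explicitly.
with contextlib.suppress(ProviderException):
    flaky_cache_lookup()

print("both forms swallow ProviderException")
```

Note that `suppress` is only a readability win when deliberately discarding the exception is correct; where the review asks for logging instead, the explicit `except` block with a `logger.error` call remains the right shape.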
streaming_providers/realdebrid/utils.py
27-29: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling. (B904)

89-91: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling. (B904)
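Ruff's B904 asks that an exception raised inside an `except` block carry explicit chaining, so the traceback shows the original error as the cause rather than as a confusing secondary failure. A minimal sketch with a simplified exception class and a hypothetical helper name:

```python
class ProviderException(Exception):
    """Simplified stand-in for the project's exception class."""


def extract_torrent_id(payload: dict) -> str:
    try:
        return payload["data"]["magnets"][0]["id"]
    except KeyError as err:
        # `from err` records the KeyError as __cause__, so the traceback
        # reads "The above exception was the direct cause of..." instead of
        # "During handling of the above exception, another exception occurred".
        raise ProviderException("Malformed response from provider") from err


try:
    extract_torrent_id({})
except ProviderException as exc:
    print(type(exc.__cause__).__name__)  # KeyError
```

Using `from None` instead deliberately hides the original exception, which is appropriate when the inner error is an implementation detail callers should not see.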
🔇 Additional comments (32)
streaming_providers/realdebrid/api.py (2)
**21-23: ⚠️ Potential issue — LGTM! Consider awaiting `authorize()`**

The migration to asynchronous context management is a good improvement for resource handling. However, there's a potential issue to address: if `rd_client.authorize()` is an asynchronous method in the updated `RealDebrid` class, it should be awaited. Consider updating the code as follows:

```diff
 async with RealDebrid() as rd_client:
-    response = rd_client.authorize(data.device_code)
+    response = await rd_client.authorize(data.device_code)
     return JSONResponse(content=response, headers=const.NO_CACHE_HEADERS)
```

To confirm if `authorize()` is indeed asynchronous, we can check its implementation:

```bash
#!/bin/bash
# Description: Check if authorize() in RealDebrid class is asynchronous
ast-grep --lang python --pattern 'class RealDebrid:
  $$$
  async def authorize($_):
    $$$
'
```

If this returns a match, we should update the code to await the call.
**13-16: ⚠️ Potential issue — LGTM! Consider awaiting `get_device_code()`**

The migration to asynchronous context management is a good improvement for resource handling. However, there's a potential issue to address: if `rd_client.get_device_code()` is an asynchronous method in the updated `RealDebrid` class, it should be awaited. Consider updating the code as follows:

```diff
 async with RealDebrid() as rd_client:
     return JSONResponse(
-        content=rd_client.get_device_code(), headers=const.NO_CACHE_HEADERS
+        content=await rd_client.get_device_code(), headers=const.NO_CACHE_HEADERS
     )
```

To confirm if `get_device_code()` is indeed asynchronous, we can check its implementation. If this returns a match, we should update the code to await the call.
streaming_providers/debridlink/api.py (2)
**13-16: Excellent implementation of async context manager**

The transition to using an async context manager for the DebridLink client is a great improvement. This change:

- Ensures proper resource management
- Allows for non-blocking I/O operations, potentially improving performance
- Maintains backwards compatibility by keeping the function signature unchanged

The implementation correctly awaits the result of `dl_client.get_device_code()` and uses the constant `NO_CACHE_HEADERS`, maintaining consistency with the previous version.
**Line range hint 1-25: Overall excellent transition to async implementation**

The changes in this file successfully achieve the PR objective of migrating to async implementations. Both `get_device_code` and `authorize` functions have been updated to use asynchronous context managers, which should lead to improved performance for I/O-bound operations.

The modifications maintain backwards compatibility and adhere to Python's best practices for async programming. These changes contribute positively to the overall goal of enhancing the efficiency and responsiveness of the streaming providers module. Great job on this implementation!
streaming_providers/parser.py (1)
**8-9: LGTM! Verify usage across the codebase.**

The changes to make `filename` and `file_index` optional parameters (allowing `None`) improve the function's flexibility and align well with its existing logic. The function already has fallback mechanisms for when these parameters are not provided, so no additional changes to the function body are necessary.

To ensure that this change doesn't introduce any issues, please verify that all calls to this function across the codebase are still valid. Run the following script to check the usage, and review the results to confirm that all calls remain valid with the new optional parameters.

✅ Verification successful

Verified: usage of `select_file_index_from_torrent` remains consistent across the codebase. All existing calls to `select_file_index_from_torrent` continue to provide the `filename` and `file_index` parameters, ensuring compatibility with the updated function signature where these parameters are now optional. No issues were found during the verification process.

🏁 Scripts executed

The following scripts were executed for the analysis:

```bash
#!/bin/bash
# Description: Find all calls to select_file_index_from_torrent and verify they're still valid with the new signature.

# Search for function calls
rg --type python -A 5 'select_file_index_from_torrent\s*\('

# Search for imports of this function to catch cases where it might be renamed
rg --type python 'from\s+streaming_providers\.parser\s+import.*select_file_index_from_torrent'
```

Length of output: 220

```bash
#!/bin/bash
# Description: Retry with the correct ripgrep type identifier for Python files.
rg --type py -A 5 'select_file_index_from_torrent\s*\('
rg --type py 'from\s+streaming_providers\.parser\s+import.*select_file_index_from_torrent'
```

Length of output: 3094
streaming_providers/premiumize/api.py (2)
**26-26: Verify the presence of `access_token` in `token_data`**

When accessing `token_data["access_token"]`, there's an assumption that this key exists. If `access_token` is missing, a `KeyError` will be raised. To make the code more robust, consider adding a check to ensure that the `access_token` key is present. You can add a check like this:

```python
access_token = token_data.get("access_token")
if not access_token:
    return {"error": "Access token not found in token data"}
token = pm_client.encode_token_data(access_token)
```
**16-19: Confirm that `get_authorization_url` is synchronous**

Within the asynchronous context manager, `get_authorization_url()` is called without `await`, implying it's a synchronous method. Ensure that this is intentional and that the method does not need to be awaited. If `get_authorization_url` is asynchronous, it should be awaited:

```python
authorization_url = await pm_client.get_authorization_url()
```

Please verify the method's implementation to confirm its synchronous behavior.
streaming_providers/offcloud/utils.py (2)
**54-61: ⚠️ Potential issue — Handle exceptions instead of passing in `update_oc_cache_status`**

Currently, any `ProviderException` is silently ignored, which could mask issues that should be addressed or logged. Consider logging the exception to aid in debugging and monitoring.

Apply this diff to handle exceptions appropriately:

```diff
 except ProviderException:
-    pass
+    logger.error("Failed to update cache status", exc_info=True)
```

Likely invalid or redundant comment.
**31-34: Verify the correctness of `ProviderException` usage**

Ensure that the `ProviderException` class constructor accepts the arguments provided. Incorrect arguments may lead to runtime errors.

Run the following script to check the `ProviderException` definition:

✅ Verification successful

`ProviderException` usage is correct.

🏁 Scripts executed

The following scripts were executed for the analysis:

```bash
#!/bin/bash
# Description: Verify the constructor signature of `ProviderException`.
# Test: Search for the definition of `ProviderException`. Expect: Constructor accepts message and optional filename.
rg --type python -A 5 'class ProviderException'
```

Length of output: 80

```bash
#!/bin/bash
# Description: Retry verifying the constructor signature of `ProviderException` with correct file type.
rg --type py -A 5 'class ProviderException'
```

Length of output: 418
streaming_providers/alldebrid/utils.py (1)
**1-1: Importing `asyncio` is appropriate for asynchronous operations**

The addition of `import asyncio` is necessary to support the asynchronous functions and concurrency in the module.

streaming_providers/alldebrid/client.py (5)
**82-83: Good implementation of asynchronous `get_user_torrent_list`**

The method `get_user_torrent_list` has been correctly converted to an asynchronous function and properly awaits the `_make_request` call.

**97-105: Efficient use of async in `get_available_torrent`**

In the `get_available_torrent` method, you're effectively using asynchronous operations by awaiting `get_user_torrent_list` and iterating over the results correctly.
**75-77: Verify the request payload in `add_magnet_link`**

In the `add_magnet_link` method, you're sending data using `data={"magnets[]": magnet_link}`. Confirm that the API expects the magnet link to be sent in this format and that using the `data` parameter is appropriate. If the API expects a JSON payload, you should use the `json` parameter instead.

Run the following script to check the API documentation or existing code for the correct usage:

```bash
#!/bin/bash
# Description: Verify the correct method to send magnet links to the API.

# Assuming API documentation is available locally
cat docs/alldebrid_api.md | grep -A5 'magnet/upload'

# Alternatively, check other instances where _make_request is used for magnet uploads
rg --type py '_make_request.*magnet/upload' -A5
```
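One way to make the `data`-vs-`json` choice mechanical is to reject calls that pass both, so a conflict fails fast instead of producing a malformed request. A hypothetical guard (not the project's actual `_make_request`, just a sketch of the validation step):

```python
from typing import Any, Optional


def build_request_kwargs(
    data: Optional[dict] = None, json: Optional[dict] = None
) -> dict[str, Any]:
    # Passing both would mean sending a form body and a JSON body at once,
    # so refuse the combination up front.
    if data is not None and json is not None:
        raise ValueError("Pass either 'data' or 'json', not both")
    kwargs: dict[str, Any] = {}
    if data is not None:
        kwargs["data"] = data  # form-encoded body, e.g. {"magnets[]": ...}
    if json is not None:
        kwargs["json"] = json  # JSON body
    return kwargs


print(build_request_kwargs(json={"magnets": ["magnet:?xt=abc"]}))
```

The returned kwargs can then be forwarded to the HTTP client, keeping the two body encodings mutually exclusive at every call site.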
`39-40`: Confirm that `super()._make_request` is asynchronous

Ensure that the `super()._make_request` method in the parent `DebridClient` class is asynchronous. Calling `await` on a non-async method will result in a runtime error.

Run the following script to check the parent class definition:
```shell
#!/bin/bash
# Description: Verify that DebridClient._make_request is an async method.

# Search for DebridClient class definition and _make_request method
rg --type py -A5 'class DebridClient' streaming_providers/debrid_client.py

# Look for 'async def _make_request' within DebridClient
```
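Beyond grepping, whether an inherited method is a coroutine function can also be checked at runtime with `inspect.iscoroutinefunction`. A self-contained sketch with simplified stand-in classes (not the project's real implementations):

```python
import asyncio
import inspect


class DebridClient:  # simplified stand-in for the real base class
    async def _make_request(self, method: str, url: str) -> dict:
        return {"method": method, "url": url}


class AllDebrid(DebridClient):
    async def _make_request(self, method: str, url: str) -> dict:
        # Awaiting a non-async parent method here would raise a TypeError
        # at runtime, which is exactly the failure mode the comment warns about.
        return await super()._make_request(method, url)


# Verify before awaiting: the parent implementation must be a coroutine function.
is_async = inspect.iscoroutinefunction(DebridClient._make_request)
result = asyncio.run(AllDebrid()._make_request("GET", "/magnet/status"))
print(is_async, result)
```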
`28-32`: Ensure proper usage of `data` and `json` parameters in `_make_request`

The `_make_request` method now includes both `data` and `json` parameters. Be cautious to avoid conflicts, as passing both can lead to unexpected behavior. Ensure that when calling this method, only one of these parameters is used as appropriate for the request.

Run the following script to check all calls to `_make_request` for proper usage:

streaming_providers/premiumize/client.py (8)
Line range hint `47-58`: Async conversion of `get_token` is appropriate.

The `get_token` method has been correctly converted to an asynchronous function, and the use of `await` with `_make_request` is appropriate.
`60-65`: Async conversion of `add_magnet_link` is appropriate.

The `add_magnet_link` method is correctly updated to be asynchronous, and it appropriately uses `await` with `_make_request`.
`67-72`: Async conversion of `create_folder` is appropriate.

The `create_folder` method has been properly converted to an asynchronous function, and the use of `await` is correct.
`74-75`: Async conversion of `get_transfer_list` is appropriate.

The method `get_transfer_list` has been correctly updated to be asynchronous, ensuring non-blocking I/O operations.
`89-93`: Async conversion of `get_folder_list` is appropriate.

The `get_folder_list` method is correctly updated to be asynchronous, allowing for non-blocking retrieval of folder lists.
`96-99`: Async conversion of `delete_folder` is appropriate.

The `delete_folder` method has been properly converted to an asynchronous function, and it uses `await` appropriately.
`101-104`: Async conversion of `delete_torrent` is appropriate.

The `delete_torrent` method is correctly updated to be asynchronous, improving the responsiveness of deletion operations.
Line range hint `106-116`: Ensure robust error handling in `get_torrent_instant_availability`.

The method checks if `results.get("status") != "success"` and raises a `ProviderException` if necessary. This is good practice. Ensure that any other potential exceptions from `_make_request` are also appropriately handled.

streaming_providers/premiumize/utils.py (2)
`104-113`: ⚠️ Potential issue — Handle exceptions explicitly instead of passing

The `except ProviderException: pass` block silently ignores exceptions, which can make debugging difficult and hide underlying issues. Consider logging the exception or handling specific error cases.

Suggested change:

```diff
  except ProviderException as e:
-     pass
+     logger.error(f"Error updating cache status: {e}")
```

Likely invalid or redundant comment.
`124-130`: ⚠️ Potential issue — Avoid silent exception handling

In the `except ProviderException: return []` block, exceptions are caught and an empty list is returned without any logging. This could make diagnosing issues harder. Consider logging the exception before returning.

Suggested change:

```diff
  except ProviderException as e:
+     logger.error(f"Error fetching downloaded info hashes: {e}")
      return []
```

Likely invalid or redundant comment.
streaming_providers/offcloud/client.py (1)
`122-127`: ⚠️ Potential issue — Ensure type consistency when matching episodes

The comparison `if episode in PTT.parse_title(link).get("episodes", [])` assumes that both `episode` and the items in `episodes` are of the same type. Verify that `PTT.parse_title(link).get("episodes", [])` returns a list of integers to match with `episode: Optional[int]`. If `episodes` contains strings, you may need to convert them to integers before comparison.

Run the following script to check the types of episodes returned:
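Until the parsed types are verified, a defensive comparison can normalize values to `int` before matching. A standalone sketch of that idea (it does not call `PTT` itself; the helper name is hypothetical):

```python
def episode_matches(episode, parsed_episodes) -> bool:
    """Return True if `episode` appears in `parsed_episodes`, coercing
    parsed values to int so '2' and 2 compare equal."""
    if episode is None:
        return False
    normalized = set()
    for value in parsed_episodes:
        try:
            normalized.add(int(value))
        except (TypeError, ValueError):
            continue  # skip values that cannot be interpreted as episode numbers
    return int(episode) in normalized


m1 = episode_matches(2, ["1", "2", "3"])  # string episodes still match
m2 = episode_matches(4, [1, 2, 3])
print(m1, m2)
```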
streaming_providers/realdebrid/client.py (2)
`42-57`: Verify asynchronous behavior of `super()._make_request`

In the overridden `_make_request` method, you're calling `await super()._make_request(...)`. Please ensure that `DebridClient`'s `_make_request` method is asynchronous. If it's synchronous, this will raise a `TypeError` at runtime.

Confirm that `DebridClient`'s `_make_request` method is defined as `async def`.
`15-18`: Ensure compatibility of asynchronous methods with the base class

The methods `_handle_service_specific_errors` and `_make_request` have been converted to asynchronous functions. Please verify that the base class `DebridClient` methods are also asynchronous. If `DebridClient` has synchronous versions of these methods, this could lead to issues when using `await` with them.

Run the following script to check if the base class methods are asynchronous:

streaming_providers/torbox/client.py (3)
`19-27`: Ensure compatibility of type hints with supported Python versions

The type hints use the union operator (`|`), such as `Optional[dict | str]`, which requires Python 3.10 or later. If your project needs to support earlier Python versions (e.g., 3.9 or below), consider using `Union` from the `typing` module instead:

```python
from typing import Union, Optional

# Example:
data: Optional[Union[dict, str]] = None
```
`19-32`: LGTM — Asynchronous request handling is implemented correctly

The `_make_request` method has been correctly updated to handle asynchronous operations. The addition of the `json` parameter allows for flexibility in sending JSON payloads. The use of `await` ensures that the asynchronous requests are properly awaited.
`13-14`: Ensure proper implementation of asynchronous context management

The method `__aexit__` has been implemented to support asynchronous context management. Please verify that the `__aenter__` method is also implemented so that the `Torbox` class can be used as an asynchronous context manager.

Run the following script to verify the implementation of `__aenter__`:
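For illustration, a paired `__aenter__`/`__aexit__` implementation might look like the following sketch. The `Torbox` class and its fields here are simplified stand-ins, not the actual client (which would typically create and close an `httpx.AsyncClient` in these hooks):

```python
import asyncio


class Torbox:  # illustrative stand-in only
    def __init__(self, token: str):
        self.token = token
        self.closed = False

    async def __aenter__(self) -> "Torbox":
        # real client would construct its httpx.AsyncClient here
        return self

    async def __aexit__(self, exc_type, exc, tb) -> None:
        # real client would `await self.client.aclose()` here
        self.closed = True


async def main() -> bool:
    async with Torbox(token="t") as client:
        assert not client.closed  # resources live inside the block
    return client.closed  # __aexit__ has run by now


was_closed = asyncio.run(main())
print(was_closed)
```

Without `__aenter__`, the `async with Torbox() as ...` statement raises `AttributeError` at runtime, so both hooks must exist together.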
`2-2`: Importing `Any` and `Optional` for type annotations

The addition of `Any` and `Optional` from the `typing` module is appropriate for the type hints used in the code.
```python
async with Premiumize() as pm_client:
    return RedirectResponse(
        pm_client.get_authorization_url(), headers=const.NO_CACHE_HEADERS
    )
```
Add exception handling for the asynchronous context

While using the asynchronous context manager with `Premiumize`, any exceptions that occur within the `async with` block may lead to unhandled exceptions. Consider adding error handling to manage potential exceptions gracefully and provide meaningful feedback to the user.
Here's an example of how you might implement this:

```python
async with Premiumize() as pm_client:
    try:
        authorization_url = pm_client.get_authorization_url()
        return RedirectResponse(authorization_url, headers=const.NO_CACHE_HEADERS)
    except Exception as e:
        # Handle specific exceptions as needed
        return {"error": str(e)}
```
```python
async with Premiumize() as pm_client:
    token_data = await pm_client.get_token(code)
    token = pm_client.encode_token_data(token_data["access_token"])
    user_data = schemas.UserData(
        streaming_provider=schemas.StreamingProvider(
            service="premiumize", token=token
        )
    )
    encrypted_str = crypto.encrypt_user_data(user_data)
    return RedirectResponse(
        f"/{encrypted_str}/configure", headers=const.NO_CACHE_HEADERS
    )
```
Handle potential exceptions during token retrieval and user data processing

In the `oauth2_redirect` function, exceptions might occur during the asynchronous operations, such as retrieving the token or processing user data. It's important to handle these exceptions to prevent unhandled errors and to provide clear error messages to the user.
Consider wrapping the operations in a try-except block:

```python
async with Premiumize() as pm_client:
    try:
        token_data = await pm_client.get_token(code)
        token = pm_client.encode_token_data(token_data["access_token"])
        user_data = schemas.UserData(
            streaming_provider=schemas.StreamingProvider(
                service="premiumize", token=token
            )
        )
        encrypted_str = crypto.encrypt_user_data(user_data)
        return RedirectResponse(
            f"/{encrypted_str}/configure", headers=const.NO_CACHE_HEADERS
        )
    except Exception as e:
        # Handle specific exceptions as needed
        return {"error": str(e)}
```
```python
torrent_info = await get_torrent_info(ad_client, info_hash)
if not torrent_info:
    torrent_id = await add_new_torrent(ad_client, magnet_link)
else:
    torrent_id = torrent_info.get("id")

return await wait_for_download_and_get_link(
    ad_client,
    torrent_id,
    filename,
    file_index,
    episode,
    max_retries,
    retry_interval,
)
```
🛠️ Refactor suggestion
Ensure proper exception handling within `get_video_url_from_alldebrid`

Consider adding exception handling to manage potential errors from the asynchronous calls to `get_torrent_info`, `add_new_torrent`, and `wait_for_download_and_get_link`. This will improve the robustness of the function.

Example of adding exception handling:
```python
async with AllDebrid(
    token=user_data.streaming_provider.token, user_ip=user_ip
) as ad_client:
    try:
        torrent_info = await get_torrent_info(ad_client, info_hash)
        if not torrent_info:
            torrent_id = await add_new_torrent(ad_client, magnet_link)
        else:
            torrent_id = torrent_info.get("id")
        return await wait_for_download_and_get_link(
            ad_client,
            torrent_id,
            filename,
            file_index,
            episode,
            max_retries,
            retry_interval,
        )
    except ProviderException as e:
        # Handle specific provider exceptions if necessary
        raise e
    except Exception as e:
        # Handle generic exceptions
        raise ProviderException("An error occurred while fetching the video URL") from e
```
Actionable comments posted: 13
🧹 Outside diff range and nitpick comments (9)
streaming_providers/parser.py (4)
`27-28`: Improved file selection logic

The addition of the `is_video_file` check is a good improvement. It ensures that only video files are selected when a specific file index is provided.

However, we can simplify the code structure slightly. Consider combining the two conditions into a single `if` statement for better readability:

```python
if file_index is not None and file_index < len(files) and is_video_file(
    files[file_index][name_key]
):
    return file_index
```

This change addresses the static analysis suggestion (SIM102) and makes the code more concise.
`40-46`: Improved episode matching and error handling

The changes in the episode matching logic are beneficial:

- The addition of the `is_video_file` check ensures consistency with other selection methods.
- The new `ProviderException` improves error handling when no matching file is found.

These modifications enhance the robustness of the function.

Consider making the error message more specific:

```python
raise ProviderException(
    f"No matching video file found for episode {episode}",
    "no_matching_file.mp4",
)
```

This change provides more context about why no matching file was found.
`54-60`: Enhanced largest file selection and error handling

The changes in the largest file selection logic are beneficial:

- The addition of the `is_video_file` check ensures that only video files are considered as valid selections.
- The new `ProviderException` improves error handling when no suitable video file is found.

These modifications enhance the robustness and consistency of the function.

Consider making the error message more specific:

```python
raise ProviderException(
    "No suitable video file found in the torrent",
    "no_matching_file.mp4",
)
```

This change provides more context about why no matching file was found.
Line range hint `1-60`: Overall assessment: Significant improvements to file selection logic

The changes made to the `select_file_index_from_torrent` function represent a substantial improvement in its functionality, flexibility, and error handling:

- The function now consistently checks for video files across all selection methods.
- Error handling has been enhanced with specific exceptions for better debugging and user feedback.
- The addition of the `file_size_callback` allows for more dynamic file size retrieval.
- Optional parameters provide greater flexibility in function usage.

These modifications align well with the stated goal of improving performance and transitioning to more robust code. The function is now better equipped to handle various scenarios in torrent file selection, particularly focusing on video files.

As this function seems to be part of a larger transition towards asynchronous programming, consider whether this function might benefit from being asynchronous as well, especially if the `file_size_callback` or `is_video_file` function might involve I/O operations.

🧰 Tools

🪛 Ruff

26-27: Use a single `if` statement instead of nested `if` statements (SIM102)
streaming_providers/offcloud/client.py (2)
`80-89`: LGTM with suggestion: Async conversion of `get_available_torrent`

The `get_available_torrent` method has been correctly converted to an async function, properly using `await` for the `get_user_torrent_list` call. The use of `next()` with a generator expression is an efficient approach.

However, to improve robustness, consider using the `get()` method when accessing the 'originalLink' key to prevent a potential KeyError:

```diff
  return next(
      (
          torrent
          for torrent in available_torrents
-         if info_hash.casefold() in torrent["originalLink"].casefold()
+         if info_hash.casefold() in torrent.get("originalLink", "").casefold()
      ),
      None,
  )
```

This change will handle cases where a torrent dictionary might not have the 'originalLink' key.
`94-103`: LGTM with suggestion: Efficient implementation of `get_file_sizes`

The new `get_file_sizes` static method is an efficient implementation using `httpx.AsyncClient` and `asyncio.gather` for concurrent HTTP HEAD requests. This approach aligns well with the async migration objective and should improve performance when dealing with multiple files.

However, to improve robustness, consider adding error handling for cases where the 'Content-Length' header might be missing or invalid:

```diff
  return [
-     {**file_data, "size": int(response.headers["Content-Length"])}
+     {**file_data, "size": int(response.headers.get("Content-Length", 0))}
      for file_data, response in zip(files_data, file_sizes)
  ]
```

This change will default to a size of 0 if the 'Content-Length' header is missing, preventing potential KeyError or ValueError exceptions.

streaming_providers/torbox/client.py (1)
streaming_providers/torbox/client.py (1)
Line range hint `82-94`: Enhance success condition checking in `create_download_link`

In `create_download_link`, the success condition checks if `"successfully"` is in `response.get("detail")`. This approach may not be robust if the `detail` key is missing or its content changes. Consider checking for a specific status code, a success flag, or handling exceptions to reliably determine if the operation was successful.

Update the success check for a more reliable condition. For example:

```python
if response.get("status") == "success":
    return response
raise ProviderException(
    f"Failed to create download link from Torbox {response}",
    "transfer_error.mp4",
)
```

streaming_providers/alldebrid/utils.py (2)
Line range hint `28-44`: Ensure safe access to torrent links in `wait_for_download_and_get_link`

When accessing `torrent_info["links"][file_index]["link"]`, there's a risk of `KeyError` or `IndexError` if the `links` list or the specified `file_index` does not exist. Consider adding checks or exception handling to validate the presence of these elements.

Apply this diff to safely access torrent links:

```diff
  file_index = select_file_index_from_torrent(
      torrent_info,
      filename,
      file_index,
      episode,
      file_key="links",
      name_key="filename",
  )
- response = await ad_client.create_download_link(
-     torrent_info["links"][file_index]["link"]
- )
+ try:
+     link = torrent_info["links"][file_index]["link"]
+ except (KeyError, IndexError):
+     raise ProviderException("File link not found in torrent info")
+ response = await ad_client.create_download_link(link)
  return response["data"]["link"]
```
Line range hint `48-77`: Add exception handling in `get_video_url_from_alldebrid`

To improve robustness, consider adding exception handling around the asynchronous calls within `get_video_url_from_alldebrid`. This will help manage potential errors from `get_torrent_info`, `add_new_torrent`, and `wait_for_download_and_get_link`, providing clearer error messages to the user.

Apply this diff to include exception handling:

```diff
  async def get_video_url_from_alldebrid(
      info_hash: str,
      magnet_link: str,
      user_data: UserData,
      filename: str,
      user_ip: str,
      file_index: int,
      episode: int = None,
      max_retries=5,
      retry_interval=5,
      **kwargs,
  ) -> str:
      async with AllDebrid(
          token=user_data.streaming_provider.token, user_ip=user_ip
      ) as ad_client:
+         try:
              torrent_info = await get_torrent_info(ad_client, info_hash)
              if not torrent_info:
                  torrent_id = await add_new_torrent(ad_client, magnet_link)
              else:
                  torrent_id = torrent_info.get("id")
              return await wait_for_download_and_get_link(
                  ad_client,
                  torrent_id,
                  filename,
                  file_index,
                  episode,
                  max_retries,
                  retry_interval,
              )
+         except ProviderException as e:
+             # Handle specific provider exceptions if necessary
+             raise e
+         except Exception as e:
+             # Handle generic exceptions
+             raise ProviderException("An error occurred while fetching the video URL") from e
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (6)
- streaming_providers/alldebrid/utils.py (3 hunks)
- streaming_providers/debrid_client.py (1 hunks)
- streaming_providers/offcloud/client.py (2 hunks)
- streaming_providers/offcloud/utils.py (1 hunks)
- streaming_providers/parser.py (1 hunks)
- streaming_providers/torbox/client.py (4 hunks)
🧰 Additional context used
🪛 Ruff
streaming_providers/debrid_client.py
23-26: Use `contextlib.suppress(ProviderException)` instead of `try`-`except`-`pass`

Replace with `contextlib.suppress(ProviderException)` (SIM105)
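A minimal illustration of the pattern SIM105 suggests — the `ProviderException` below is a local stand-in for the project's exception type:

```python
import contextlib


class ProviderException(Exception):  # stand-in for the project's exception
    pass


def risky() -> None:
    raise ProviderException("token already revoked")


# Equivalent to `try: risky() except ProviderException: pass`,
# but states the intent (ignore this specific error) directly.
with contextlib.suppress(ProviderException):
    risky()

survived = True
print(survived)
```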
99-102: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
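A small sketch of the `raise ... from err` pattern B904 refers to; the chained original exception is preserved on `__cause__`, so tracebacks show both errors (the `ProviderException` here is a stand-in):

```python
import json


class ProviderException(Exception):  # stand-in exception type
    pass


def parse(payload: str) -> dict:
    try:
        return json.loads(payload)
    except ValueError as error:
        # `from error` records the original exception as __cause__,
        # instead of implying it happened *during* our error handling.
        raise ProviderException("failed to parse response") from error


try:
    parse("not-json")
except ProviderException as exc:
    cause_type = type(exc.__cause__).__name__

print(cause_type)
```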
streaming_providers/parser.py
26-27: Use a single `if` statement instead of nested `if` statements (SIM102)
🔇 Additional comments (24)
streaming_providers/parser.py (3)
`5-6`: Improved function signature and imports

The changes to the imports and function signature are well-thought-out:

- The new imports support the enhanced error handling and file validation.
- Making `filename` and `file_index` optional (using `Optional[str]` and `Optional[int]`) increases the function's flexibility.
- The addition of the `file_size_callback` parameter allows for dynamic file size retrieval when needed.

These modifications improve the function's versatility and error handling capabilities.
Also applies to: 11-12, 19-19
`34-34`: Enhanced filename matching

The addition of the `is_video_file` check in the filename matching logic is a good improvement. It ensures that only video files are selected when matching by filename, which aligns with the function's purpose and prevents potential issues with selecting non-video files.
`48-50`: Flexible file size retrieval

The addition of the `file_size_callback` logic is a good improvement. It allows for dynamic file size retrieval when needed, making the function more versatile. This can be particularly useful in scenarios where file sizes are not immediately available in the torrent info.

streaming_providers/offcloud/utils.py (4)
`10-46`: LGTM: Successful conversion to async

The function `get_video_url_from_offcloud` has been correctly converted to an asynchronous function. All calls to OffCloud client methods are properly awaited, and the async context manager is used correctly. The error handling and overall logic remain intact.
`49-63`: LGTM: Proper async conversion

The `update_oc_cache_status` function has been successfully converted to an asynchronous function. The async context manager is used correctly, and the call to `get_torrent_instant_availability` is properly awaited. The exception handling remains in place, maintaining the function's robustness.
`66-79`: LGTM: Async conversion and past issue resolved

The `fetch_downloaded_info_hashes_from_oc` function has been successfully converted to an asynchronous function. The async context manager is used correctly, and the call to `get_user_torrent_list` is properly awaited.

Additionally, the issue mentioned in the past review comment regarding incorrect access to `torrent` in the list comprehension has been addressed. The code now correctly accesses the `"originalLink"` key of the `torrent` dictionary before splitting.
: Overall: Good async conversion with room for improvementThe file has been successfully converted to use async/await syntax, which should improve performance for I/O-bound operations. All functions have been properly updated to be asynchronous, and the OffCloud client is consistently used with async context managers.
However, there's still room for improvement in error handling, particularly in the
delete_all_torrents_from_oc
function. Addressing this will enhance the robustness of the code.Great job on the async conversion! Just a bit more attention to error handling will make this code even better.
streaming_providers/debrid_client.py (2)
`1-15`: LGTM: Appropriate imports and initialization for async refactoring

The changes to the imports and class initialization are well-suited for the transition to asynchronous programming. The use of `httpx.AsyncClient` with a timeout of 18.0 seconds (as noted in the comment) is appropriate for async HTTP requests within the Stremio context.
`104-131`: LGTM: Appropriate async conversions and abstract method declarations

The changes to the remaining methods (`initialize_headers`, `disable_access_token`, `wait_for_status`, and `get_torrent_info`) are well-implemented:

- All methods have been correctly converted to async.
- `wait_for_status` now uses `asyncio.sleep` instead of `time.sleep`, which is appropriate for async code.
- Abstract methods are correctly decorated with `@abstractmethod`.

These changes are consistent with the overall async refactoring of the class and follow best practices for abstract base classes in Python.

streaming_providers/offcloud/client.py (9)
asyncio
andhttpx
imports is appropriate for the transition to asynchronous operations. These changes align well with the overall objective of migrating to async functions.Also applies to: 5-5
16-19
: Verify the intention of empty async methodsThe
initialize_headers
anddisable_access_token
methods have been converted to async but remain empty. Please confirm if this is intentional or if these methods require implementation.If implementation is needed, consider adding TODOs or raising
NotImplementedError
.
22-28
: LGTM: Proper error handling for specific HTTP status codesThe
_handle_service_specific_errors
method effectively manages HTTP 403 and 429 status codes, raising appropriateProviderException
instances with clear error messages. This implementation enhances error handling and user feedback.
30-46
: LGTM: Async conversion and parameter updatesThe
_make_request
method has been successfully converted to async and includes new parameters that enhance its flexibility. The implementation properly utilizes the async super() call.Consider using
Union
for better compatibilityThe return type annotation
dict | str
is only valid in Python 3.10+. For better compatibility with earlier Python versions, consider usingUnion[dict, str]
from thetyping
module.- from typing import Optional, List + from typing import Optional, List, Union async def _make_request( self, method: str, url: str, - data: Optional[dict | str] = None, + data: Optional[Union[dict, str]] = None, json: Optional[dict] = None, params: Optional[dict] = None, is_return_none: bool = False, is_expected_to_fail: bool = False, delete: bool = False, - ) -> dict | list: + ) -> Union[dict, list]:Clarify the usage of the
delete
parameterThe
delete
parameter is used to determine the base URL, which seems unconventional. Consider refactoring this to use separate methods or a more explicit approach for different base URLs.Can you provide more context on why the
delete
parameter is used to switch between base URLs?
Line range hint
48-63
: LGTM: Proper async conversion ofadd_magnet_link
The
add_magnet_link
method has been correctly converted to an async function. The implementation properly usesawait
for the_make_request
call, and the existing error handling logic has been preserved. These changes align well with the overall async migration objective.
65-77
: LGTM: Effective async conversion of multiple methodsThe
get_user_torrent_list
,get_torrent_info
, andget_torrent_instant_availability
methods have been successfully converted to async functions. Each method correctly usesawait
for the_make_request
calls, maintaining their original functionality while adapting to the new asynchronous paradigm.
91-92
: LGTM: Proper async conversion ofexplore_folder_links
The
explore_folder_links
method has been correctly converted to an async function. It properly usesawait
for the_make_request
call, maintaining its original functionality while adapting to the new asynchronous paradigm.
105-126
: LGTM: Effective async conversion ofcreate_download_link
The
create_download_link
method has been successfully converted to an async function. The implementation correctly usesawait
for theexplore_folder_links
call and appropriately integrates the newget_file_sizes
method as a callback. These changes maintain the original functionality while adapting to the asynchronous paradigm, potentially improving performance when dealing with multiple files.
128-131
:⚠️ Potential issueConsider using DELETE HTTP method for
delete_torrent
While the
delete_torrent
method has been correctly converted to an async function, it's using the GET HTTP method for a deletion operation. This is not RESTful and could potentially lead to unintended side effects.Consider changing the HTTP method to DELETE to properly represent the action:
async def delete_torrent(self, request_id: str) -> dict: return await self._make_request( - "GET", f"/cloud/remove/{request_id}", delete=True + "DELETE", f"/cloud/remove/{request_id}", delete=True )This change will make the API usage more semantically correct and align with RESTful principles.
Likely invalid or redundant comment.
streaming_providers/torbox/client.py (3)
66-72
:⚠️ Potential issueEnsure compatibility with Python versions below 3.9
The type annotation
list[str]
inget_torrent_instant_availability
requires Python 3.9 or newer. If the project supports earlier Python versions, consider importingList
fromtyping
and updating the annotation accordingly.To check the project's Python version compatibility, review the project's configuration files or documentation.
If compatibility with Python versions below 3.9 is needed, apply this diff:
+from typing import Any, Optional, List - async def get_torrent_instant_availability(self, torrent_hashes: list[str]): + async def get_torrent_instant_availability(self, torrent_hashes: List[str]):
74-78
:⚠️ Potential issueUse compatible type annotations for return types
The return type annotation
dict[str, Any] | None
inget_available_torrent
uses the union operator|
, which is available from Python 3.10 onwards. For compatibility with earlier Python versions, replace it withOptional[Dict[str, Any]]
.If supporting Python versions earlier than 3.10, apply this diff:
+from typing import Any, Optional, Dict - async def get_available_torrent(self, info_hash) -> dict[str, Any] | None: + async def get_available_torrent(self, info_hash) -> Optional[Dict[str, Any]]:
`10-11`: Ensure `self.token` is initialized before using in `initialize_headers`

In `initialize_headers`, `self.token` is used to set the authorization header. Ensure that `self.token` is properly initialized before this method is called to avoid `AttributeError`.

Check where `self.token` is set and ensure it is assigned before `initialize_headers` is invoked.

Here is a shell script to search for assignments to `self.token`:
✅ Verification successful

`self.token` is properly initialized before use: `self.token` is assigned in `streaming_providers/debrid_client.py`.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify that `self.token` is initialized before use.
# Search for `self.token` assignments in the codebase.
rg --python --pcre2 "(self\.token\s*=\s*.*)" streaming_providers/
```

Length of output: 478

Script:

```shell
#!/bin/bash
# Description: Verify that `self.token` is initialized before use.
# Search for `self.token` assignments in Python files within the streaming_providers directory.
rg "self\.token\s*=\s*.*" --type=python streaming_providers/
```

Length of output: 93

Script:

```shell
#!/bin/bash
# Description: Verify that `self.token` is initialized before use.
# Search for `self.token` assignments in Python files within the streaming_providers directory.
rg "self\.token\s*=\s*.*" streaming_providers/ --glob "*.py"
```

Length of output: 126
streaming_providers/alldebrid/utils.py (3)
`80-99`: Good use of safe access and asynchronous operations in `update_ad_cache_status`

The function correctly uses asynchronous context managers and safely accesses dictionary keys using the `.get()` method. This approach prevents potential `KeyError` exceptions and ensures efficient execution.
`124-135`: ⚠️ Potential issue — Ensure safe access and handle exceptions in `delete_all_torrents_from_ad`

Accessing `torrents["data"]["magnets"]` assumes the keys exist. Additionally, consider handling exceptions during individual torrent deletions so a failure in one does not prevent others from being deleted.

Apply this diff to improve robustness:

```diff
  async def delete_all_torrents_from_ad(user_data: UserData, user_ip: str, **kwargs):
      """Deletes all torrents from the AllDebrid account."""
      async with AllDebrid(
          token=user_data.streaming_provider.token, user_ip=user_ip
      ) as ad_client:
          torrents = await ad_client.get_user_torrent_list()
+         magnets = torrents.get("data", {}).get("magnets", [])
          await asyncio.gather(
              *[
                  ad_client.delete_torrent(torrent["id"])
-                 for torrent in torrents["data"]["magnets"]
+                 for torrent in magnets
              ],
+             return_exceptions=True,
          )
```

Using `return_exceptions=True` allows all deletions to be attempted, collecting exceptions without stopping the entire process.

Likely invalid or redundant comment.
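A runnable illustration of how `return_exceptions=True` changes `asyncio.gather` behavior — failures are returned in-place instead of aborting the whole batch (the delete function below is a stand-in, not the real client call):

```python
import asyncio


async def delete_torrent(torrent_id: int) -> str:
    """Stand-in for a per-torrent delete call; id 2 simulates a failure."""
    if torrent_id == 2:
        raise RuntimeError(f"torrent {torrent_id} not found")
    return f"deleted {torrent_id}"


async def delete_all() -> list:
    # With return_exceptions=True, each result slot holds either the
    # return value or the exception object, preserving input order.
    return await asyncio.gather(
        *(delete_torrent(i) for i in (1, 2, 3)),
        return_exceptions=True,
    )


results = asyncio.run(delete_all())
print(results)
```

Without the flag, the first failure would propagate out of `gather` and the remaining deletions would be cancelled.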
`105-118`: ⚠️ Potential issue — Handle missing keys when accessing nested data structures in `fetch_downloaded_info_hashes_from_ad`

When accessing `available_torrents["data"]["magnets"]`, ensure that both `"data"` and `"magnets"` keys exist to prevent `KeyError`. Use the `.get()` method or default values to safely access nested data.

Apply this diff to include safe access:

```diff
  async def fetch_downloaded_info_hashes_from_ad(
      user_data: UserData, user_ip: str, **kwargs
  ) -> list[str]:
      """Fetches the info_hashes of all torrents downloaded in the AllDebrid account."""
      try:
          async with AllDebrid(
              token=user_data.streaming_provider.token, user_ip=user_ip
          ) as ad_client:
              available_torrents = await ad_client.get_user_torrent_list()
-             if not available_torrents.get("data"):
+             if not available_torrents.get("data") or not available_torrents["data"].get("magnets"):
                  return []
              return [
                  torrent["hash"]
-                 for torrent in available_torrents["data"]["magnets"]
+                 for torrent in available_torrents.get("data", {}).get("magnets", [])
              ]
      except ProviderException:
          return []
```

Likely invalid or redundant comment.
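The safe nested-access pattern can be illustrated standalone; the response shapes below are hypothetical examples, not actual AllDebrid payloads:

```python
def extract_hashes(response: dict) -> list:
    """Walk response['data']['magnets'] without ever raising KeyError."""
    return [
        magnet["hash"]
        for magnet in response.get("data", {}).get("magnets", [])
        if "hash" in magnet
    ]


ok = extract_hashes({"data": {"magnets": [{"hash": "abc"}, {"id": 7}]}})
empty = extract_hashes({"status": "error"})  # missing "data" key is fine
print(ok, empty)
```

Each `.get()` supplies a neutral default (`{}` or `[]`), so a missing level simply yields an empty result instead of an exception.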
streaming_providers/debrid_client.py
Outdated
```diff
  @staticmethod
- def _parse_response(response, is_return_none):
+ async def _parse_response(response: httpx.Response, is_return_none: bool):
      if is_return_none:
          return {}
      try:
          return response.json()
-     except JSONDecodeError as error:
+     except ValueError as error:
          raise ProviderException(
              f"Failed to parse response error: {error}. \nresponse: {response.text}",
              "api_error.mp4",
          )
```
🛠️ Refactor suggestion
Use exception chaining in _parse_response

The conversion of _parse_response to a static async method is appropriate. However, the exception handling can be improved.

Use exception chaining to preserve the original exception context when raising a new exception. Apply this diff:
try:
return response.json()
except ValueError as error:
raise ProviderException(
f"Failed to parse response error: {error}. \nresponse: {response.text}",
"api_error.mp4",
- )
+ ) from error
This change will maintain the original exception context, making it easier to debug issues related to response parsing.
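To see what raise ... from error buys you, here is a minimal standalone demonstration; parse_payload is a hypothetical helper, not the project's code:

```python
import json

def parse_payload(text: str) -> dict:
    try:
        return json.loads(text)
    except ValueError as error:
        # "from error" stores the original exception in __cause__,
        # so the JSON failure still appears in the chained traceback.
        raise RuntimeError(f"Failed to parse response: {error}") from error

try:
    parse_payload("not json")
except RuntimeError as exc:
    cause = exc.__cause__  # the original JSONDecodeError survives
```

The chained traceback prints "The above exception was the direct cause of the following exception:", which is exactly the debugging context the review asks to preserve.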
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
     @staticmethod
     async def _parse_response(response: httpx.Response, is_return_none: bool):
         if is_return_none:
             return {}
         try:
             return response.json()
         except ValueError as error:
             raise ProviderException(
                 f"Failed to parse response error: {error}. \nresponse: {response.text}",
                 "api_error.mp4",
             ) from error
🧰 Tools
🪛 Ruff
99-102: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling

(B904)
streaming_providers/debrid_client.py
Outdated
     @staticmethod
     async def _handle_request_error(error: Exception, is_expected_to_fail: bool):
         if isinstance(error, httpx.TimeoutException):
             raise ProviderException("Request timed out.", "torrent_not_downloaded.mp4")
-        except requests.exceptions.ConnectionError:
+        elif isinstance(error, httpx.NetworkError):
             raise ProviderException(
                 "Failed to connect to Debrid service.", "debrid_service_down_error.mp4"
             )
         elif not is_expected_to_fail:
             raise ProviderException(f"Request error: {str(error)}", "api_error.mp4")
Use httpx.ConnectError instead of httpx.NetworkError

The conversion of _handle_request_error to a static async method is appropriate. However, there's an issue with the exception handling.

httpx does not have an exception named NetworkError. Use httpx.ConnectError to handle connection-related errors. Apply this diff:
elif isinstance(error, httpx.TimeoutException):
raise ProviderException("Request timed out.", "torrent_not_downloaded.mp4")
- elif isinstance(error, httpx.NetworkError):
+ elif isinstance(error, httpx.ConnectError):
raise ProviderException(
"Failed to connect to Debrid service.", "debrid_service_down_error.mp4"
)
This change will ensure that the correct exception is caught for connection-related errors.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
     @staticmethod
     async def _handle_request_error(error: Exception, is_expected_to_fail: bool):
         if isinstance(error, httpx.TimeoutException):
             raise ProviderException("Request timed out.", "torrent_not_downloaded.mp4")
         elif isinstance(error, httpx.ConnectError):
             raise ProviderException(
                 "Failed to connect to Debrid service.", "debrid_service_down_error.mp4"
             )
         elif not is_expected_to_fail:
             raise ProviderException(f"Request error: {str(error)}", "api_error.mp4")
streaming_providers/debrid_client.py
Outdated
     async def _make_request(
         self,
         method: str,
         url: str,
-        data=None,
-        params=None,
-        is_return_none=False,
-        is_expected_to_fail=False,
-    ) -> dict:
-        response = self._perform_request(method, url, data, params)
-        self._handle_errors(response, is_expected_to_fail)
-        return self._parse_response(response, is_return_none)
-
-    def _perform_request(self, method, url, data, params):
+        data: Optional[dict | str] = None,
+        json: Optional[dict] = None,
+        params: Optional[dict] = None,
+        is_return_none: bool = False,
+        is_expected_to_fail: bool = False,
+    ) -> dict | list:
         try:
-            return requests.request(
-                method,
-                url,
-                params=params,
-                data=data,
-                headers=self.headers,
-                timeout=const.DEBRID_SERVER_TIMEOUT,
-            )
-        except requests.exceptions.Timeout:
+            response = await self.client.request(
+                method, url, data=data, json=json, params=params, headers=self.headers
+            )
+            response.raise_for_status()
+            return await self._parse_response(response, is_return_none)
+        except httpx.RequestError as error:
+            await self._handle_request_error(error, is_expected_to_fail)
+        except httpx.HTTPStatusError as error:
+            await self._handle_http_error(error, is_expected_to_fail)
+        except Exception as error:
+            await self._handle_request_error(error, is_expected_to_fail)
Remove broad Exception catch in _make_request

The conversion of _make_request to an async method and the use of self.client.request are appropriate. The error handling has been improved to use specific httpx exceptions. However, there's still a broad Exception catch at the end of the try-except block.
Remove the broad Exception catch to prevent masking unexpected errors and to make debugging easier. Apply this diff:
except httpx.RequestError as error:
await self._handle_request_error(error, is_expected_to_fail)
except httpx.HTTPStatusError as error:
await self._handle_http_error(error, is_expected_to_fail)
- except Exception as error:
- await self._handle_request_error(error, is_expected_to_fail)
This change will ensure that only expected exceptions are caught and handled, while allowing unexpected errors to propagate for easier debugging.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
     async def _make_request(
         self,
         method: str,
         url: str,
         data: Optional[dict | str] = None,
         json: Optional[dict] = None,
         params: Optional[dict] = None,
         is_return_none: bool = False,
         is_expected_to_fail: bool = False,
     ) -> dict | list:
         try:
             response = await self.client.request(
                 method, url, data=data, json=json, params=params, headers=self.headers
             )
             response.raise_for_status()
             return await self._parse_response(response, is_return_none)
         except httpx.RequestError as error:
             await self._handle_request_error(error, is_expected_to_fail)
         except httpx.HTTPStatusError as error:
             await self._handle_http_error(error, is_expected_to_fail)
streaming_providers/debrid_client.py
Outdated
     async def _handle_http_error(
         self, error: httpx.HTTPStatusError, is_expected_to_fail: bool
     ):
         if error.response.status_code in [502, 503, 504]:
             raise ProviderException(
                 "Debrid service is down.", "debrid_service_down_error.mp4"
             )

         if is_expected_to_fail:
             return

         await self._handle_service_specific_errors(error)

         if error.response.status_code == 401:
             raise ProviderException("Invalid token", "invalid_token.mp4")

         formatted_traceback = "".join(traceback.format_exception(error))
         raise ProviderException(
             f"API Error {error.response.text} \n{formatted_traceback}",
             "api_error.mp4",
         )
Correct usage of traceback.format_exception

The conversion of _handle_http_error to an async method is correct. However, there's an issue with the usage of traceback.format_exception.

traceback.format_exception requires the exception type, value, and traceback as arguments. Passing only the exception instance may not produce the desired traceback information. Apply this diff:
- formatted_traceback = "".join(traceback.format_exception(error))
+ formatted_traceback = "".join(
+ traceback.format_exception(type(error), error, error.__traceback__)
+ )
raise ProviderException(
f"API Error {error.response.text} \n{formatted_traceback}",
"api_error.mp4",
)
This change will ensure that the full traceback information is included in the exception message.
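A quick standalone check of the three-argument form; note that on Python 3.10+ the single-argument form format_exception(error) is also accepted, so the explicit form mainly matters for older interpreters:

```python
import traceback

def failing() -> None:
    raise ValueError("boom")

try:
    failing()
except ValueError as error:
    # Explicit (type, value, traceback) form, compatible with
    # Python versions before 3.10 as well as 3.10+.
    formatted = "".join(
        traceback.format_exception(type(error), error, error.__traceback__)
    )
```

The formatted string contains the full frame listing (including the failing() call site) followed by the exception line.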
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
     async def _handle_http_error(
         self, error: httpx.HTTPStatusError, is_expected_to_fail: bool
     ):
         if error.response.status_code in [502, 503, 504]:
             raise ProviderException(
                 "Debrid service is down.", "debrid_service_down_error.mp4"
             )
         if is_expected_to_fail:
             return
         await self._handle_service_specific_errors(error)
         if error.response.status_code == 401:
             raise ProviderException("Invalid token", "invalid_token.mp4")
         formatted_traceback = "".join(
             traceback.format_exception(type(error), error, error.__traceback__)
         )
         raise ProviderException(
             f"API Error {error.response.text} \n{formatted_traceback}",
             "api_error.mp4",
         )
 async def get_torrent_info(ad_client, info_hash):
     torrent_info = await ad_client.get_available_torrent(info_hash)
     if torrent_info and torrent_info["status"] == "Ready":
         return torrent_info
     elif torrent_info and torrent_info["statusCode"] == 7:
-        ad_client.delete_torrent(torrent_info.get("id"))
+        await ad_client.delete_torrent(torrent_info.get("id"))
         raise ProviderException(
-            "Not enough seeders available for parse magnet link",
+            "Not enough seeders available to parse magnet link",
Ensure safe access to dictionary keys to prevent KeyError

In get_torrent_info, accessing torrent_info["status"], torrent_info["statusCode"], and torrent_info.get("id") assumes these keys always exist. To prevent potential KeyError, consider using the .get() method or adding checks to ensure these keys are present before accessing them.

Apply this diff to safely access dictionary keys:
async def get_torrent_info(ad_client, info_hash):
torrent_info = await ad_client.get_available_torrent(info_hash)
- if torrent_info and torrent_info["status"] == "Ready":
+ if torrent_info and torrent_info.get("status") == "Ready":
return torrent_info
- elif torrent_info and torrent_info["statusCode"] == 7:
+ elif torrent_info and torrent_info.get("statusCode") == 7:
await ad_client.delete_torrent(torrent_info.get("id"))
raise ProviderException(
"Not enough seeders available to parse magnet link",
"transfer_error.mp4",
)
return None
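The difference between indexing and .get() can be verified with a small standalone sketch; torrent_status is a hypothetical helper used only for illustration:

```python
from typing import Optional

def torrent_status(torrent_info: Optional[dict]) -> str:
    # .get() returns None for a missing key instead of raising KeyError,
    # so partial API responses are handled gracefully.
    if torrent_info and torrent_info.get("status") == "Ready":
        return "ready"
    if torrent_info and torrent_info.get("statusCode") == 7:
        return "stalled"
    return "unknown"

print(torrent_status({"status": "Ready"}))  # ready
print(torrent_status({"statusCode": 7}))    # stalled
print(torrent_status({}))                   # unknown (no KeyError raised)
```

With plain indexing, the second and third calls would raise KeyError on the missing "status" key instead of falling through.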
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
 async def get_torrent_info(ad_client, info_hash):
     torrent_info = await ad_client.get_available_torrent(info_hash)
     if torrent_info and torrent_info.get("status") == "Ready":
         return torrent_info
     elif torrent_info and torrent_info.get("statusCode") == 7:
         await ad_client.delete_torrent(torrent_info.get("id"))
         raise ProviderException(
             "Not enough seeders available to parse magnet link",
             "transfer_error.mp4",
         )
     return None
…remove index base file finding due to not trustable
Actionable comments posted: 20
🧹 Outside diff range and nitpick comments (7)
streaming_providers/parser.py (2)
38-40: Consider consolidating error messages.

The error messages for both cases (no matching file for a specific episode and no matching video file in general) are identical. Consider differentiating these messages to provide more specific information about why no matching file was found.

Here's a suggestion:

if episode:
    raise ProviderException(
        f"No matching video file found for episode {episode}", "no_matching_file.mp4"
    )
# ... (rest of the function)
raise ProviderException(
    "No matching video file found in the torrent", "no_matching_file.mp4"
)

Also applies to: 52-54
77-78: Consider adding error handling for database operations.

The function performs database operations (saving torrent_stream) without any error handling. Consider wrapping these operations in try-except blocks to handle potential database errors gracefully.

Here's a suggestion:

try:
    await torrent_stream.save()
    logging.info(f"Updated {torrent_stream.id} metadata")
except Exception as e:
    logging.error(f"Failed to update {torrent_stream.id} metadata: {str(e)}")
    # Consider re-raising the exception or handling it based on your error management strategy

Also applies to: 115-116
streaming_providers/torbox/utils.py (1)
63-77: LGTM: New async function update_chunk_cache_status implemented correctly

The new async function update_chunk_cache_status is well-implemented with proper error handling and async/await syntax.

Consider making the boolean conversion more explicit for better readability:

stream.cached = stream.id in instant_availability_data

This change makes the intention clearer without changing the functionality.
streaming_providers/offcloud/client.py (3)
26-32: LGTM: Improved error handling with specific exceptions.

The _handle_service_specific_errors method has been properly converted to async and now handles specific HTTP status codes (403 and 429) with appropriate ProviderException raising. The error messages are descriptive and include relevant video file names.

Consider adding a default case to handle unexpected errors:

else:
    raise ProviderException(f"Unexpected error: {error}", "general_error.mp4")

This would ensure all potential errors are caught and handled consistently.
34-49: LGTM: Method successfully converted to async with improved flexibility.

The _make_request method has been properly converted to async and now handles both BASE_URL and DELETE_URL. The parameter updates and superclass method call are correct.

However, the return type hint dict | list is only valid in Python 3.10+. For better compatibility, consider using:

from typing import Union

# ...
async def _make_request(
    # ...
) -> Union[dict, list]:
    # ...

This change ensures compatibility with earlier Python versions while maintaining type hinting.
109-144: LGTM: Method successfully converted to async with improved functionality.

The create_download_link method has been properly converted to async and now handles more complex scenarios, including directory exploration and file selection. The addition of new parameters and the use of background_tasks for updating metadata demonstrate good practices for asynchronous operations.

Consider adding error handling for the case where file_index is out of range:

if file_index >= len(links):
    raise ProviderException("Selected file not found in torrent", "file_not_found.mp4")

This would prevent potential IndexError and provide a more informative error message.

streaming_providers/debridlink/utils.py (1)
22-31: Ensure consistent parameter ordering and type annotations.

In get_video_url_from_debridlink, consider grouping related parameters together, such as placing episode next to filename. Verify that all parameters have accurate type annotations to enhance readability and maintainability.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (10)
- streaming_providers/alldebrid/utils.py (1 hunks)
- streaming_providers/debrid_client.py (1 hunks)
- streaming_providers/debridlink/utils.py (1 hunks)
- streaming_providers/offcloud/client.py (2 hunks)
- streaming_providers/offcloud/utils.py (1 hunks)
- streaming_providers/parser.py (1 hunks)
- streaming_providers/pikpak/utils.py (3 hunks)
- streaming_providers/realdebrid/utils.py (1 hunks)
- streaming_providers/routes.py (5 hunks)
- streaming_providers/torbox/utils.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- streaming_providers/alldebrid/utils.py
🧰 Additional context used
🪛 Ruff
streaming_providers/debrid_client.py
23-26: Use contextlib.suppress(ProviderException) instead of try-except-pass

Replace with contextlib.suppress(ProviderException)

(SIM105)
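For reference, the two forms this Ruff hint compares are behaviorally equivalent; a standalone sketch, with a stand-in ProviderException rather than the project's class:

```python
import contextlib

class ProviderException(Exception):
    # Stand-in for the project's ProviderException.
    pass

def risky() -> None:
    raise ProviderException("token already revoked")

result = "before"
# Equivalent to: try: risky() / except ProviderException: pass
with contextlib.suppress(ProviderException):
    risky()
    result = "after"  # never reached; risky() raises first

# Execution continues here with the exception silently swallowed.
```

suppress() only swallows the listed exception types; anything else still propagates, exactly like the explicit try-except-pass.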
99-102: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling

(B904)

122-123: Use a single if statement instead of nested if statements

Combine if statements using and

(SIM102)
streaming_providers/realdebrid/utils.py
131-133: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling

(B904)
🔇 Additional comments (27)
streaming_providers/offcloud/utils.py (3)
12-63: Excellent async conversion and improved functionality.

The get_video_url_from_offcloud function has been successfully converted to an asynchronous function. The changes include:

- Proper use of async/await syntax.
- Utilization of async context manager for the OffCloud client.
- Addition of stream and background_tasks parameters for better integration.
- Improved error handling for the "error" status.
These changes align well with the PR objectives and should contribute to improved performance.
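The async-context-manager pattern praised here can be sketched in isolation; DummyClient is a hypothetical stand-in, where a real client would open an httpx.AsyncClient in __aenter__ and close it in __aexit__:

```python
import asyncio

class DummyClient:
    def __init__(self, token: str):
        self.token = token
        self.closed = True

    async def __aenter__(self):
        self.closed = False  # acquire resources (e.g. open HTTP client)
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.closed = True  # always released, even if the body raised

async def main() -> "DummyClient":
    async with DummyClient("secret") as client:
        assert not client.closed  # usable inside the block
    return client

client = asyncio.run(main())
# client.closed is True again once the block exits
```

This is why the reviewed functions no longer need explicit cleanup calls: exiting the async with block guarantees teardown.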
66-79: Async conversion looks good.

The update_oc_cache_status function has been properly converted to an asynchronous function. The changes include:

- Correct use of async/await syntax.
- Utilization of async context manager for the OffCloud client.
- Awaiting the call to get_torrent_instant_availability.

The error handling remains in place, maintaining the function's robustness.
83-96: Async conversion and past issue resolution look good.

The fetch_downloaded_info_hashes_from_oc function has been successfully converted to an asynchronous function. The changes include:

- Proper use of async/await syntax.
- Utilization of async context manager for the OffCloud client.
- Awaiting the call to get_user_torrent_list.

Additionally, the issue mentioned in the past review comment about incorrect access to torrent in the list comprehension has been addressed. The code now correctly accesses the "originalLink" key of the torrent dictionary before splitting.

streaming_providers/parser.py (3)
13-20: LGTM: Improved function signature and new parameter.

The changes to the function signature, including making filename optional and adding the file_size_callback parameter, improve the flexibility of the function. This allows for better handling of various scenarios and provider-specific requirements.
57-117: LGTM: Well-structured async function for updating torrent metadata.

The new update_torrent_streams_metadata function is well-structured and properly utilizes async/await syntax. It handles both single episode and season updates, which provides good flexibility. The use of type hints, logging, and timestamp updates are all positive aspects.
1-117: Overall, good improvements to torrent handling and metadata updating.

The changes in this file significantly enhance the functionality and flexibility of torrent file selection and metadata updating. The select_file_index_from_torrent function now handles more scenarios, and the new update_torrent_streams_metadata function provides a robust way to update torrent stream metadata asynchronously.

Key points:
- The transition to async functions where appropriate is a positive step towards improving overall application performance.
- The added flexibility in file selection and metadata updating will likely improve the application's ability to handle various torrent structures.
- The suggested optimizations and error handling improvements, if implemented, will further enhance the robustness and efficiency of these functions.
Great work on these improvements! The changes align well with the PR's objective of migrating to async and improving performance.
streaming_providers/debrid_client.py (6)
1-12: LGTM: Import statements and class definition are appropriate.

The new imports (asyncio, traceback, abstractmethod, Optional, httpx) are suitable for the async conversion and type hinting. The class definition remains unchanged, which is correct.
12-15: LGTM: Constructor updated correctly for async usage.

The __init__ method has been updated with proper type hinting for the token parameter. The initialization of httpx.AsyncClient with an 18-second timeout is appropriate, considering the comment about Stremio's 20-second timeout.
136-138: LGTM: get_torrent_info method correctly updated.

The get_torrent_info method has been properly converted to an async abstract method. This change is consistent with the overall async refactoring of the class.
1-138: LGTM: Successful async conversion with minor suggestions.

The conversion of the DebridClient class to use async programming is well-executed. The use of httpx.AsyncClient, async context managers, and appropriate error handling for httpx exceptions are all positive changes. The suggestions provided in previous comments will further improve the code quality and robustness.

Great job on this refactoring!
🧰 Tools
🪛 Ruff
23-26: Use contextlib.suppress(ProviderException) instead of try-except-pass

Replace with contextlib.suppress(ProviderException)

(SIM105)

99-102: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling

(B904)

122-123: Use a single if statement instead of nested if statements

Combine if statements using and

(SIM102)
29-50
: ⚠️ Potential issue

Remove broad Exception catch in _make_request.

The conversion of
The conversion of
_make_request
to an async method and the use ofself.client.request
are appropriate. The error handling has been improved to use specifichttpx
exceptions. However, there's still a broadException
catch at the end of the try-except block.Remove the broad Exception catch to prevent masking unexpected errors and to make debugging easier. Apply this diff:
except httpx.RequestError as error: await self._handle_request_error(error, is_expected_to_fail) except httpx.HTTPStatusError as error: await self._handle_http_error(error, is_expected_to_fail) - except Exception as error: - await self._handle_request_error(error, is_expected_to_fail)This change will ensure that only expected exceptions are caught and handled, while allowing unexpected errors to propagate for easier debugging.
Likely invalid or redundant comment.
63-83
:⚠️ Potential issueCorrect usage of traceback.format_exception.
The conversion of
_handle_http_error
to an async method is correct. However, there's an issue with the usage oftraceback.format_exception
.
traceback.format_exception
requires the exception type, value, and traceback as arguments. Passing only the exception instance may not produce the desired traceback information. Apply this diff:- formatted_traceback = "".join(traceback.format_exception(error)) + formatted_traceback = "".join( + traceback.format_exception(type(error), error, error.__traceback__) + ) raise ProviderException( f"API Error {error.response.text} \n{formatted_traceback}", "api_error.mp4", )This change will ensure that the full traceback information is included in the exception message.
Likely invalid or redundant comment.
streaming_providers/torbox/utils.py (1)
57-60
: LGTM: Type hints added to divide_chunks
The addition of type hints to the
divide_chunks
function improves code readability and maintainability. The function logic remains correct and unchanged.streaming_providers/offcloud/client.py (9)
1-3
: LGTM: Necessary imports added for async functionality and type hinting.The new imports are appropriate for the changes made in the file, supporting the transition to async methods and improved type hinting.
5-5
: LGTM: Appropriate HTTP client for async operations.The import of
httpx
is suitable for handling asynchronous HTTP requests in this updated client.
7-13
: LGTM: Necessary imports added for updated functionality.The new imports provide the required models, base class, exception handling, and utility functions for the updated
OffCloud
class.
20-25
: LGTM: Methods correctly converted to async.The
initialize_headers
anddisable_access_token
methods have been properly converted to async, maintaining consistency with the base class implementation.
Line range hint
52-67
: LGTM: Method successfully converted to async with improved error handling.The
add_magnet_link
method has been properly converted to async and now includes comprehensive error handling. The use ofProviderException
with descriptive messages and video file names enhances the user experience in case of errors.
69-76
: LGTM: Methods successfully converted to async with improved type hinting and error handling.Both
get_user_torrent_list
andget_torrent_info
methods have been properly converted to async. The addition of return type hints enhances code readability and maintainability. Theget_torrent_info
method now safely handles potential missing data in the response, improving robustness.
78-93
: LGTM: Methods successfully converted to async with improved type hinting.Both
get_torrent_instant_availability
andget_available_torrent
methods have been properly converted to async with appropriate return type hints. The case-insensitive comparison inget_available_torrent
is a good practice.However, there's a potential
KeyError
inget_available_torrent
when accessingoriginalLink
. Consider using theget
method to safely access the key:if info_hash.casefold() in torrent.get("originalLink", "").casefold()This change will prevent a
KeyError
iforiginalLink
is missing from the torrent dictionary.
95-107
: LGTM: New async method added and existing method converted to async.The
explore_folder_links
method has been properly converted to async. The new static methodget_file_sizes
efficiently usesasyncio.gather
for concurrent requests, which is a good practice for improving performance.However, there's a potential
KeyError
when accessing theContent-Length
header. Consider adding error handling:return [ {**file_data, "size": int(response.headers.get("Content-Length", 0))} for file_data, response in zip(files_data, file_sizes) ]This change will use a default value of 0 if the
Content-Length
header is missing, preventing aKeyError
.
146-149
: Consider using the DELETE HTTP method for resource deletion.

While the
delete_torrent
method has been properly converted to async, it's using a GET request for deletion. This is not RESTful and may lead to unintended side effects. Consider using the DELETE HTTP method instead:async def delete_torrent(self, request_id: str) -> dict: return await self._make_request( "DELETE", f"/cloud/remove/{request_id}", delete=True )This change aligns the method with RESTful principles and makes the intention of the operation clearer.
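The concurrent size-fetching pattern discussed in the comments above (asyncio.gather over per-file requests, with a safe Content-Length default) can be sketched standalone; fetch_headers below is a simulated stand-in for an HTTP HEAD request, not OffCloud's API:

```python
import asyncio

async def fetch_headers(url: str) -> dict:
    # Simulated stand-in for something like httpx.AsyncClient().head(url).
    await asyncio.sleep(0)
    return {"Content-Length": str(len(url) * 100)}

async def get_file_sizes(files_data: list[dict]) -> list[dict]:
    # One concurrent request per file instead of sequential round-trips.
    responses = await asyncio.gather(
        *(fetch_headers(f["url"]) for f in files_data)
    )
    # .get(..., 0) avoids a KeyError when Content-Length is absent.
    return [
        {**f, "size": int(h.get("Content-Length", 0))}
        for f, h in zip(files_data, responses)
    ]

sizes = asyncio.run(get_file_sizes([{"url": "ab"}, {"url": "abcd"}]))
# sizes[0]["size"] == 200, sizes[1]["size"] == 400 in this simulation
```

The gather call keeps the results in the same order as the input list, which is what makes the zip pairing above correct.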
streaming_providers/pikpak/utils.py (3)
138-138
: LGTM: Improved file selection logic. The removal of the `file_index` parameter and the modification of the `select_file_index_from_torrent` function call enhance the flexibility of the file selection process. This change allows for dynamic file index determination based on the provided parameters, which is a more robust approach.
266-266
: LGTM: Consistent updates to function calls. The modifications to the `find_file_in_folder_tree` function calls in `retrieve_or_download_file` are consistent with the earlier changes to the function signature. These updates ensure that the file retrieval process remains functional and aligned with the new implementation.

Also applies to: 275-275
138-138
: Overall: Improved file selection and retrieval process. The changes made to the `find_file_in_folder_tree` function and its calls in `retrieve_or_download_file` have successfully simplified the interface and enhanced the flexibility of the file selection process. These modifications contribute to a more robust and maintainable codebase by allowing dynamic file index determination based on provided parameters. The consistency of these changes across the file demonstrates a thorough approach to refactoring, ensuring that all related parts of the code are updated accordingly.
Also applies to: 266-266, 275-275
streaming_providers/routes.py (2)
Line range hint
89-116
: Ensure all `get_video_url` functions accept `background_tasks`. The addition of `background_tasks` to `get_or_create_video_url` and its inclusion in `kwargs` means that all functions in `mapper.GET_VIDEO_URL_FUNCTIONS` must accept `background_tasks` as a parameter. Passing it via `**kwargs` could lead to `TypeError` exceptions at runtime if any of these functions do not expect this parameter.

As a follow-up, please verify that all functions referenced in `GET_VIDEO_URL_FUNCTIONS` have been updated to accept `background_tasks`. This will ensure smooth execution without unexpected errors.
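The `TypeError` risk described above is easy to reproduce; `old_provider` and `new_provider` below are hypothetical stand-ins for entries in `GET_VIDEO_URL_FUNCTIONS`:

```python
def old_provider(info_hash: str):
    # Signature not updated: rejects unexpected keyword arguments.
    return info_hash

def new_provider(info_hash: str, background_tasks=None, **kwargs):
    # Updated signature: accepts background_tasks (and tolerates extras).
    return info_hash

kwargs = {"info_hash": "abc", "background_tasks": object()}

try:
    old_provider(**kwargs)  # unexpected keyword -> TypeError at runtime
except TypeError as exc:
    print("old_provider failed:", exc)

print(new_provider(**kwargs))  # prints "abc"
```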
235-235
: Pass `background_tasks` consistently to all function calls. When calling `get_or_create_video_url`, ensure that `background_tasks` is consistently passed and utilized within downstream functions. This consistency is crucial for the proper handling of asynchronous background tasks.

Please check that all invocations of `get_or_create_video_url` throughout the codebase include the `background_tasks` parameter and that it's appropriately used within those functions.
streaming_providers/debrid_client.py
Outdated
```diff
 @staticmethod
 async def _handle_request_error(error: Exception, is_expected_to_fail: bool):
     if isinstance(error, httpx.TimeoutException):
         raise ProviderException("Request timed out.", "torrent_not_downloaded.mp4")
-    except requests.exceptions.ConnectionError:
+    elif isinstance(error, httpx.TransportError):
         raise ProviderException(
             "Failed to connect to Debrid service.", "debrid_service_down_error.mp4"
         )
     elif not is_expected_to_fail:
         raise ProviderException(f"Request error: {str(error)}", "api_error.mp4")
```
Use httpx.ConnectError for more specific error handling.
The conversion of `_handle_request_error` to a static async method is appropriate. However, we can improve the specificity of the exception handling.

Replace `httpx.TransportError` with `httpx.ConnectError` for more precise error handling. Apply this diff:
```diff
 if isinstance(error, httpx.TimeoutException):
     raise ProviderException("Request timed out.", "torrent_not_downloaded.mp4")
-elif isinstance(error, httpx.TransportError):
+elif isinstance(error, httpx.ConnectError):
     raise ProviderException(
         "Failed to connect to Debrid service.", "debrid_service_down_error.mp4"
     )
```
This change will ensure that the error handling is more specific to connection-related issues.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
@staticmethod
async def _handle_request_error(error: Exception, is_expected_to_fail: bool):
    if isinstance(error, httpx.TimeoutException):
        raise ProviderException("Request timed out.", "torrent_not_downloaded.mp4")
    elif isinstance(error, httpx.ConnectError):
        raise ProviderException(
            "Failed to connect to Debrid service.", "debrid_service_down_error.mp4"
        )
    elif not is_expected_to_fail:
        raise ProviderException(f"Request error: {str(error)}", "api_error.mp4")
```
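For context on why the narrower class is safe here: in httpx's exception tree, `ConnectError` derives from `NetworkError`, which derives from `TransportError`, so the check only narrows what is caught. The dispatch pattern can be mimicked without httpx using stand-in classes:

```python
# Stand-in classes mirroring the relevant slice of httpx's tree:
# TransportError -> NetworkError -> {ConnectError, ReadError, ...}
class TransportError(Exception): ...
class NetworkError(TransportError): ...
class ConnectError(NetworkError): ...
class ReadError(NetworkError): ...

def classify(error: Exception) -> str:
    # Checking the narrower class first distinguishes connection
    # failures from other transport problems (reads, writes, protocol).
    if isinstance(error, ConnectError):
        return "service down"
    if isinstance(error, TransportError):
        return "transport problem"
    return "unexpected"

print(classify(ConnectError()))  # service down
print(classify(ReadError()))     # transport problem
```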
streaming_providers/debrid_client.py
Outdated
```diff
 @abstractmethod
 async def _handle_service_specific_errors(self, error: httpx.HTTPStatusError):
     """
     Service specific errors on api requests.
     """
     raise NotImplementedError

 @staticmethod
-def _parse_response(response, is_return_none):
+async def _parse_response(response: httpx.Response, is_return_none: bool):
     if is_return_none:
         return {}
     try:
         return response.json()
-    except JSONDecodeError as error:
+    except ValueError as error:
         raise ProviderException(
             f"Failed to parse response error: {error}. \nresponse: {response.text}",
             "api_error.mp4",
         )

-def initialize_headers(self):
+@abstractmethod
+async def initialize_headers(self):
     raise NotImplementedError

-def disable_access_token(self):
+@abstractmethod
+async def disable_access_token(self):
     raise NotImplementedError
```
🛠️ Refactor suggestion
Use exception chaining in `_parse_response`.

The conversion of these methods to async and the use of abstract methods where appropriate are correct. However, we can improve the exception handling in `_parse_response`.
Use exception chaining to preserve the original exception context when raising a new exception. Apply this diff:
```diff
 try:
     return response.json()
 except ValueError as error:
     raise ProviderException(
         f"Failed to parse response error: {error}. \nresponse: {response.text}",
         "api_error.mp4",
-    )
+    ) from error
```
This change will maintain the original exception context, making it easier to debug issues related to response parsing.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
@staticmethod
async def _parse_response(response: httpx.Response, is_return_none: bool):
    if is_return_none:
        return {}
    try:
        return response.json()
    except ValueError as error:
        raise ProviderException(
            f"Failed to parse response error: {error}. \nresponse: {response.text}",
            "api_error.mp4",
        ) from error

@abstractmethod
async def initialize_headers(self):
    raise NotImplementedError

@abstractmethod
async def disable_access_token(self):
    raise NotImplementedError
```
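The effect of `from error` can be demonstrated with the standard library: the original exception survives as `__cause__` on the re-raised one. `ProviderException` here is a minimal stand-in for the project's class:

```python
import json

class ProviderException(Exception):
    pass

def parse(payload: str):
    try:
        return json.loads(payload)
    except ValueError as error:
        # Chaining keeps the original parse error attached for debugging.
        raise ProviderException("Failed to parse response") from error

try:
    parse("not json")
except ProviderException as exc:
    print(type(exc.__cause__).__name__)  # JSONDecodeError
```

Without `from error`, the traceback still shows "During handling of the above exception..." but `__cause__` stays unset, which tools and log formatters cannot rely on.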
🧰 Tools
🪛 Ruff
99-102: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
Actionable comments posted: 15
🧹 Outside diff range and nitpick comments (2)
streaming_providers/routes.py (1)
89-90
: Update the docstring to include the new parameter. The function signature has been updated to include `background_tasks`, but the docstring doesn't reflect this change. Please update the docstring to include information about the new parameter for better documentation.

streaming_providers/offcloud/client.py (1)
34-49
: Consider adding type hints for the return value. The `_make_request` method is annotated to return `dict | list`. Consider using `Union[dict, list]` for better readability and compatibility with older Python versions.

```python
from typing import Union

async def _make_request(
    ...
) -> Union[dict, list]:
    ...
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (8)
- streaming_providers/alldebrid/utils.py (1 hunks)
- streaming_providers/debridlink/utils.py (1 hunks)
- streaming_providers/offcloud/client.py (2 hunks)
- streaming_providers/parser.py (1 hunks)
- streaming_providers/pikpak/utils.py (3 hunks)
- streaming_providers/realdebrid/utils.py (1 hunks)
- streaming_providers/routes.py (6 hunks)
- streaming_providers/torbox/utils.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- streaming_providers/pikpak/utils.py
🧰 Additional context used
🪛 Ruff
streaming_providers/realdebrid/utils.py
135-137: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
🔇 Additional comments (39)
streaming_providers/parser.py (3)
12-20
: Approve async conversion and new parameter. The conversion of the function to async and the addition of the `file_size_callback` parameter are good improvements. These changes allow for more efficient handling of I/O-bound operations and provide flexibility in file size retrieval.
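A size callback of this kind typically lets the selector defer expensive lookups to the caller. The sketch below is an assumption about the shape, not the project's actual signature; `pick_largest_file` and `fill_sizes` are hypothetical:

```python
import asyncio
from typing import Awaitable, Callable, Optional

async def pick_largest_file(
    files: list[dict],
    file_size_callback: Optional[Callable[[list[dict]], Awaitable[None]]] = None,
) -> dict:
    # If sizes are unknown, ask the caller to fill them in (e.g. via
    # concurrent HEAD requests) before choosing a file.
    if file_size_callback and any(f.get("size") is None for f in files):
        await file_size_callback(files)
    return max(files, key=lambda f: f.get("size") or 0)

async def fill_sizes(files: list[dict]) -> None:
    for f in files:
        f["size"] = len(f["name"])  # stand-in for a network size lookup

files = [{"name": "short", "size": None}, {"name": "much-longer-name", "size": None}]
print(asyncio.run(pick_largest_file(files, fill_sizes))["name"])  # much-longer-name
```

The benefit of the callback is that callers without a cheap size source can omit it, while providers that need HEAD requests only pay for them when sizes are actually missing.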
57-117
: Approve new async function for updating torrent streams metadata. The new `update_torrent_streams_metadata` function is a well-structured addition that handles both single episode and entire season updates. It's consistent with the transition to asynchronous programming and appropriately manages different scenarios for updating `TorrentStreams` metadata.
1-117
: Summary of changes and their impact. The modifications in this file significantly contribute to the overall goal of migrating streaming provider functions to async and improving performance. The conversion of `select_file_index_from_torrent` to async and the addition of the new async function `update_torrent_streams_metadata` enhance the module's ability to handle I/O-bound operations efficiently.

These changes, combined with the improvements in error handling and the introduction of the `file_size_callback`, provide a more robust and flexible implementation. The suggested optimizations for file selection logic and season-specific updates, if implemented, would further enhance the efficiency of these functions.

Overall, these changes align well with the PR objectives and represent a positive step towards improving the performance of the streaming providers module.
streaming_providers/alldebrid/utils.py (1)
10-20
:⚠️ Potential issue: Ensure safe access to dictionary keys

While the conversion to async is correct, there's a potential issue with accessing dictionary keys without checks. This could lead to `KeyError` exceptions if the expected keys are missing.

Consider using the `.get()` method or adding explicit checks. Apply this diff to improve robustness:

```diff
 async def get_torrent_info(ad_client, info_hash):
     torrent_info = await ad_client.get_available_torrent(info_hash)
-    if torrent_info and torrent_info["status"] == "Ready":
+    if torrent_info and torrent_info.get("status") == "Ready":
         return torrent_info
-    elif torrent_info and torrent_info["statusCode"] == 7:
+    elif torrent_info and torrent_info.get("statusCode") == 7:
         await ad_client.delete_torrent(torrent_info.get("id"))
         raise ProviderException(
             "Not enough seeders available to parse magnet link",
             "transfer_error.mp4",
         )
     return None
```

Likely invalid or redundant comment.
streaming_providers/routes.py (5)
4-4
: LGTM: New imports support async functionality. The addition of `Annotated` and `BackgroundTasks` imports aligns with the changes made to support asynchronous operations and improved type hinting.

Also applies to: 12-12
117-117
: LGTM: Consistent update to kwargs dictionary. The addition of `background_tasks` to the `kwargs` dictionary is consistent with the function signature update and allows for proper passing of background tasks to provider-specific functions.
195-196
: LGTM: Enhanced type hinting and background task support. The changes to the function signature improve type hinting and add support for background tasks:

- The use of `Annotated` with `Depends` for `user_data` enhances type hinting, as mentioned in a previous review comment.
- The addition of the `background_tasks` parameter is consistent with the shift towards asynchronous operations.

These changes are appropriate and align with modern FastAPI practices.
236-236
: LGTM: Consistent use of background_tasks. The addition of `background_tasks` to the `get_or_create_video_url` function call is consistent with the earlier changes and ensures proper handling of background tasks throughout the request lifecycle.
Line range hint
1-300
: Summary: Successful migration to async and performance improvements. The changes in this file consistently support the transition to asynchronous operations and improved type hinting. Key improvements include:

- Addition of `BackgroundTasks` support.
- Enhanced type hinting with `Annotated`.
- Consistent updates to function signatures and calls.

These changes align well with the PR objectives and should contribute to improved performance in handling streaming provider requests.
streaming_providers/debridlink/utils.py (4)
1-2
: Imports updated correctly for asynchronous functionalityThe addition of
asyncio
andOptional
imports is appropriate for the asynchronous modifications and type hints.
99-101
:⚠️ Potential issueHandle exceptions during torrent deletion in
delete_all_torrents_from_dl
When using
asyncio.gather
to delete torrents, if anydelete_torrent
call raises an exception, it might halt the entire deletion process. To ensure all torrents are attempted for deletion, handle exceptions for each call individually.Apply this diff to handle exceptions:
await asyncio.gather( - *[dl_client.delete_torrent(torrent["id"]) for torrent in torrents["value"]] + *[ + dl_client.delete_torrent(torrent["id"]) for torrent in torrents["value"] + ], + return_exceptions=True ) +for torrent, result in zip(torrents["value"], results): + if isinstance(result, Exception): + logger.error(f"Failed to delete torrent {torrent['id']}: {result}")Likely invalid or redundant comment.
65-74
:⚠️ Potential issueCheck for 'value' key in instant availability response
In
update_dl_cache_status
, accessinginstant_availability_response["value"]
without ensuring the key exists may raise aKeyError
. To prevent this, add a check to verify that the'value'
key is present.Apply this diff to add the key check:
for stream in streams: - stream.cached = bool( - stream.id in instant_availability_response["value"] - ) + if "value" in instant_availability_response: + stream.cached = stream.id in instant_availability_response["value"] + else: + stream.cached = FalseLikely invalid or redundant comment.
11-14
: Ensure all calls toget_download_link
are updatedThe function
get_download_link
has been converted to an asynchronous function. Please verify that all calls to this function elsewhere in the codebase have been updated to useawait
.Run the following script to find any non-await calls to
get_download_link
:streaming_providers/torbox/utils.py (4)
59-62
: LGTM!The function signature has been updated with proper type hints, and the internal logic remains unchanged.
65-79
: LGTM!The new asynchronous function
update_chunk_cache_status
is implemented correctly with proper error handling.
71-75
: The past review comment is still valid and applicable to the current code segment.
110-120
: LGTM!The function has been successfully converted to async with proper type hints and updated to use
await
for the asynchronous call toselect_file_index_from_torrent
.streaming_providers/offcloud/client.py (14)
38-38
: UseUnion[dict, str]
instead ofdict | str
for Python version compatibility.The type hint
dict | str
is only valid in Python 3.10 and later. To maintain compatibility with earlier versions of Python (e.g., Python 3.7, 3.8, 3.9), consider usingUnion[dict, str]
from thetyping
module.
86-93
: Handle potentialKeyError
when accessingoriginalLink
.When iterating over
available_torrents
, it's possible that some torrent dictionaries might not have theoriginalLink
key, which could raise aKeyError
. Consider usingtorrent.get("originalLink", "")
to safely access the key.
144-147
: Use the correct HTTP method for deleting resources.The
delete_torrent
method currently uses the HTTPGET
method to perform a delete operation:return await self._make_request( "GET", f"/cloud/remove/{request_id}", delete=True )Using
GET
for deletion is not RESTful and may lead to unintended side effects. Consider using the HTTPDELETE
method instead to properly represent the action.
1-3
: LGTM!The imports are organized and follow the standard conventions.
20-21
: Verify the emptyinitialize_headers
method.The
initialize_headers
method is currently empty. Confirm if this is intentional or if any initialization logic needs to be added.
23-24
: Verify the emptydisable_access_token
method.The
disable_access_token
method is currently empty. Confirm if this is intentional or if any token disabling logic needs to be added.
26-32
: LGTM!The error handling for specific HTTP status codes is implemented correctly, raising appropriate exceptions with relevant messages.
Line range hint
52-67
: LGTM!The
add_magnet_link
method is implemented correctly, handling the response data and raising appropriate exceptions when needed.
69-70
: LGTM!The
get_user_torrent_list
method is implemented correctly, making the necessary request to retrieve the torrent history.
72-76
: LGTM!The
get_torrent_info
method is implemented correctly, making the request to retrieve torrent status and returning the relevant information.
78-82
: LGTM!The
get_torrent_instant_availability
method is implemented correctly, making the request to check the cache status of the provided magnet links.
95-96
: LGTM!The
explore_folder_links
method is implemented correctly, making the request to retrieve the folder links for the given request ID.
98-105
: LGTM!The
update_file_sizes
method is implemented correctly, making concurrent HEAD requests to retrieve the file sizes and updating thefiles_data
dictionary accordingly.
107-142
: Verify the behavior whenfilename
isNone
.When
filename
isNone
, the code updates the torrent streams metadata in the background usingbackground_tasks.add_task()
. Ensure that this behavior aligns with the expected functionality and does not introduce any unintended side effects.streaming_providers/realdebrid/utils.py (8)
39-51
: LGTM!The logic for updating torrent stream metadata in the background task looks good. It correctly passes all the required parameters to the
update_torrent_streams_metadata
function.
158-175
: LGTM!The
add_new_torrent
function looks good. It correctly checks for torrent limit and duplicate torrents before adding a new one. The error handling for failed magnet link addition is also proper.
133-137
:⚠️ Potential issueUse exception chaining to preserve original exception context
In the
except ProviderException as error
block, you are raising a new exception without chaining it from the original one. To maintain the original traceback and aid in debugging, usefrom
when re-raising the exception.Apply this diff to chain exceptions properly:
except ProviderException as error: await rd_client.delete_torrent(torrent_id) raise ProviderException( f"Failed to start torrent download, {error}", "transfer_error.mp4" + ) from error
🧰 Tools
🪛 Ruff
135-137: Within an
except
clause, raise exceptions withraise ... from err
orraise ... from None
to distinguish them from errors in exception handling(B904)
178-179
: 🛠️ Refactor suggestionRemove unused
**kwargs
parameter fromupdate_rd_cache_status
functionThe
**kwargs
parameter is not used within theupdate_rd_cache_status
function and can be removed to clean up the function signature.Apply this diff to remove the unused parameter:
async def update_rd_cache_status( streams: list[TorrentStreams], user_data: UserData, user_ip: str - , **kwargs ):
201-203
: 🛠️ Refactor suggestionRemove unused
**kwargs
parameter fromfetch_downloaded_info_hashes_from_rd
functionThe
**kwargs
parameter is not used within thefetch_downloaded_info_hashes_from_rd
function and can be removed to streamline the code.Apply this diff to remove the unused parameter:
async def fetch_downloaded_info_hashes_from_rd( user_data: UserData, user_ip: str - , **kwargs ) -> list[str]:
216-216
: 🛠️ Refactor suggestionRemove unused
**kwargs
parameter fromdelete_all_watchlist_rd
functionThe
**kwargs
parameter is not utilized within thedelete_all_watchlist_rd
function and can be omitted to simplify the function definition.Apply this diff to remove the unused parameter:
async def delete_all_watchlist_rd(user_data: UserData, user_ip: str - , **kwargs ):
77-78
:⚠️ Potential issueHandle missing
mimeType
key inresponse
When checking the
mimeType
, if the key is missing from theresponse
dictionary, callingstartswith
onNone
will raise anAttributeError
. Provide a default value to prevent this error.Apply this diff to handle possible missing
mimeType
:- if not response.get("mimeType").startswith("video"): + if not response.get("mimeType", "").startswith("video"):Likely invalid or redundant comment.
53-73
: Verify file selection logicThe logic for selecting the relevant file and handling cases where the file is not selected or the number of selected files doesn't match the number of links looks correct. However, please verify the following:
- Ensure that
selected_file_index
is a valid index within the range oftorrent_info["files"]
.- Confirm that the
"selected"
and"id"
keys exist in thetorrent_info["files"]
dictionaries.- Verify that
len(selected_files)
should indeed matchlen(torrent_info["links"])
for the logic to proceed without re-downloading the torrent.Generate a script to navigate the codebase and confirm the validity of these assumptions:
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 11
🧹 Outside diff range and nitpick comments (8)
streaming_providers/premiumize/client.py (2)
Line range hint
82-91
: Consider adding error handling for torrent not found.The method should handle cases where the torrent is not found in the transfer list.
async def get_torrent_info(self, torrent_id): transfer_list = await self.get_transfer_list() + if transfer_list.get("status") != "success": + raise ProviderException("Failed to get transfer list", "transfer_error.mp4") torrent_info = next( ( torrent for torrent in transfer_list["transfers"] if torrent["id"] == torrent_id ), None, ) + if torrent_info is None: + raise ProviderException(f"Torrent {torrent_id} not found", "not_found.mp4") return torrent_info
Line range hint
125-146
: Add error handling and optimize head requests.The current implementation has several potential issues:
- No timeout on head requests which could lead to hanging
- No error handling for malformed Content-Disposition headers
- Sequential head requests could be slow for many torrents
Consider implementing these improvements:
async def get_available_torrent(self, info_hash: str) -> dict[str, Any] | None: torrent_list_response = await self.get_transfer_list() if torrent_list_response.get("status") != "success": if torrent_list_response.get("message") == "Not logged in.": raise ProviderException( "Premiumize is not logged in.", "invalid_token.mp4" ) raise ProviderException( "Failed to get torrent info from Premiumize", "transfer_error.mp4" ) available_torrents = torrent_list_response["transfers"] for torrent in available_torrents: src = torrent.get("src") + if not src: + continue + try: + response = await self.client.head(src, timeout=5.0) + except Exception as e: + continue if response.status_code != 200: continue magnet_link_content = response.headers.get("Content-Disposition") + if not magnet_link_content: + continue if info_hash in magnet_link_content: return torrentstreaming_providers/offcloud/client.py (1)
112-147
: Add method documentationThis complex method would benefit from docstring documentation explaining its parameters and return value.
Add docstring documentation:
async def create_download_link( self, request_id: str, torrent_info: dict, stream: TorrentStreams, filename: Optional[str], season: Optional[int], episode: Optional[int], background_tasks, ) -> str: """Create a download link for the specified torrent file. Args: request_id: The OffCloud request ID torrent_info: Dictionary containing torrent information stream: TorrentStreams object for metadata update filename: Optional filename to match season: Optional season number for TV shows episode: Optional episode number for TV shows background_tasks: FastAPI background tasks object Returns: str: Direct download URL for the selected file """streaming_providers/debrid_client.py (2)
1-19
: LGTM! Consider adding type hints for class variables.The async client initialization and imports look good. The timeout setting is well-documented with respect to Stremio's constraints.
Consider adding type hints for class variables in
__init__
:def __init__(self, token: Optional[str] = None): - self.token = token - self.is_private_token = False - self.headers = {} + self.token: Optional[str] = token + self.is_private_token: bool = False + self.headers: dict = {}
33-55
: LGTM! Consider adding return type annotation.The async request handling implementation looks good, but could benefit from a more specific return type annotation.
Add a more specific return type using TypeAlias:
+from typing import TypeAlias + +JsonResponse: TypeAlias = dict[str, any] | list[any] - ) -> dict | list: + ) -> JsonResponse:streaming_providers/debridlink/utils.py (1)
16-23
: Add type annotation forbackground_tasks
parameterTo enhance code readability and facilitate static type checking, consider adding a type annotation for the
background_tasks
parameter inget_download_link
. Sincebackground_tasks
is an instance ofBackgroundTasks
, you can annotate it accordingly.Apply this diff to add the type annotation:
async def get_download_link( torrent_info: dict, stream: TorrentStreams, - background_tasks, + background_tasks: BackgroundTasks, filename: Optional[str], file_index: Optional[int], episode: Optional[int], season: Optional[int], ) -> str:streaming_providers/debridlink/client.py (2)
Line range hint
66-77
: Add return type annotation toget_token
Consider adding a return type annotation to the
get_token
method for improved code clarity and type checking. Example:async def get_token(self, client_id, device_code) -> dict[str, Any]:
Line range hint
78-88
: Add return type annotation torefresh_token
Including a return type annotation for
refresh_token
enhances code maintainability and type checking. Example:async def refresh_token(self, client_id, refresh_token) -> dict[str, Any]:
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (9)
- streaming_providers/alldebrid/utils.py (1 hunks)
- streaming_providers/debrid_client.py (1 hunks)
- streaming_providers/debridlink/client.py (7 hunks)
- streaming_providers/debridlink/utils.py (1 hunks)
- streaming_providers/offcloud/client.py (2 hunks)
- streaming_providers/premiumize/client.py (6 hunks)
- streaming_providers/premiumize/utils.py (1 hunks)
- streaming_providers/realdebrid/api.py (1 hunks)
- streaming_providers/realdebrid/client.py (10 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- streaming_providers/realdebrid/api.py
🧰 Additional context used
🪛 Ruff
streaming_providers/debrid_client.py
27-30: Use `contextlib.suppress(ProviderException)` instead of `try`-`except`-`pass`. Replace with `contextlib.suppress(ProviderException)` (SIM105)
107-110: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
130-131: Use a single `if` statement instead of nested `if` statements. Combine `if` statements using `and` (SIM102)
streaming_providers/premiumize/utils.py
81-81: Dictionary comprehension uses static key: `"files"` (B035)
streaming_providers/realdebrid/client.py
2-2: `typing.override` imported but unused. Remove unused import (F401)

2-2: `typing.overload` imported but unused. Remove unused import (F401)
🔇 Additional comments (32)
streaming_providers/premiumize/client.py (5)
20-21
: Empty async method needs review.The previous review comment about removing the
async
keyword from_handle_service_specific_errors
is still valid, as the method contains no async operations.
23-38
: LGTM! Well-structured async implementation.The method properly implements async/await patterns and includes comprehensive type hints. The token handling logic is clear and appropriate.
40-46
: Consider improving error handling and async implementation.The previous review comment about removing the unnecessary
async
keyword is still valid.Consider making the error handling more explicit:
async def initialize_headers(self): if self.token: access_token = self.decode_token_str(self.token) if access_token: self.headers = {"Authorization": f"Bearer {access_token}"} else: + # Token exists but couldn't be decoded, assume it's a private API key self.is_private_token = True
122-123
: Empty async method needs implementation.The previous review comment about removing the unnecessary
async
keyword from the emptydisable_access_token
method is still valid.
147-148
: LGTM! Clean async implementation.The method correctly implements the async pattern for account info retrieval.
streaming_providers/offcloud/client.py (6)
26-32
: LGTM! Well-structured error handlingThe error handling implementation is clear and provides good user feedback with appropriate status code handling.
34-49
: Use appropriate HTTP method for delete operationsThe delete operation should use the HTTP DELETE method instead of GET to follow RESTful principles.
Line range hint
52-81
: LGTM! Well-implemented magnet link handlingThe async implementation is clean with proper error handling and clear response processing.
84-93
: Add safe dictionary accessThe access to
originalLink
could raise KeyError if the key doesn't exist.
98-110
: Add safe Content-Length parsingThe conversion of Content-Length header to integer needs safer handling.
Line range hint
1-152
: Overall implementation looks good!The conversion to async/await is well-implemented with proper error handling, concurrency patterns, and good separation of concerns. The code structure is clean and maintainable.
streaming_providers/debrid_client.py (3)
21-31
: LGTM! Async context manager implementation is correct. The async context manager methods are properly implemented for resource management.
🧰 Tools
🪛 Ruff
27-30: Use `contextlib.suppress(ProviderException)` instead of `try`-`except`-`pass`. Replace with `contextlib.suppress(ProviderException)` (SIM105)
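For illustration, the SIM105 rewrite looks like this (the exception type and function are stand-ins for the project's own code):

```python
import contextlib

class ProviderException(Exception):
    """Stand-in for the project's exception type."""

def revoke_token():
    raise ProviderException("token already revoked")

# Equivalent to try/except/pass, but states the intent in a single line (SIM105).
with contextlib.suppress(ProviderException):
    revoke_token()

print("continued")  # the exception was swallowed; execution reaches here
```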
56-91
: LGTM! Error handling methods properly converted to async. The error handling methods are properly converted to async and maintain good separation of concerns.
93-142
: LGTM! Methods properly converted to async with good abstraction. The abstract methods and the `wait_for_status` implementation are correctly converted to async operations.
🧰 Tools
🪛 Ruff
107-110: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

130-131: Use a single `if` statement instead of nested `if` statements. Combine `if` statements using `and` (SIM102)
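The B904 fix can be sketched like this (the function and message are illustrative, not the project's actual code):

```python
import json

class ProviderException(Exception):
    """Stand-in for the project's exception type."""

def parse_provider_response(payload: str) -> dict:
    try:
        return json.loads(payload)
    except json.JSONDecodeError as err:
        # 'from err' keeps the original error as __cause__, so tracebacks
        # distinguish the provider failure from bugs in the handler itself.
        raise ProviderException("Malformed provider response") from err

print(parse_provider_response('{"status": "ok"}'))  # {'status': 'ok'}
```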
streaming_providers/debridlink/utils.py (4)
1-2
: LGTM! The imports of `asyncio` and `Optional` are appropriate for asynchronous operations and optional type hints.
4-4
: LGTM! Importing `BackgroundTasks` from `fastapi` is necessary for managing background tasks.
10-13
: LGTM! The imported functions `select_file_index_from_torrent` and `update_torrent_streams_metadata` are properly utilized in the updated asynchronous functions.
65-68
: LGTM! Proper error handling is implemented for cases where `torrent_id` is not obtained after adding the magnet link. This improves robustness.

streaming_providers/alldebrid/utils.py (7)
16-23
: Ensure safe access to dictionary keys to prevent `KeyError`. In the `get_torrent_info` function, accessing `torrent_info["status"]`, `torrent_info["statusCode"]`, and `torrent_info.get("id")` assumes these keys always exist. To prevent a potential `KeyError`, consider using the `.get()` method or adding checks to ensure these keys are present before accessing them.
29-30
: Handle potential `KeyError` in nested dictionary access. Accessing `response_data["data"]["magnets"][0]["id"]` assumes that the keys `"data"` and `"magnets"` exist and that the list is not empty. This could raise a `KeyError` or `IndexError` if the response is not as expected. Consider adding checks or using the `.get()` method to ensure safe access.
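A defensive version of that access might look like this (the response shape is assumed from the review comment):

```python
from typing import Optional

def first_magnet_id(response_data: dict) -> Optional[int]:
    # Each .get() falls back to an empty container, so a malformed response
    # yields None instead of KeyError/IndexError.
    magnets = response_data.get("data", {}).get("magnets", [])
    if not magnets:
        return None
    return magnets[0].get("id")

print(first_magnet_id({"data": {"magnets": [{"id": 42}]}}))  # 42
```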
68-70
: Ensure safe access to nested dictionary keys. In `wait_for_download_and_get_link`, accessing `torrent_info["links"][file_index]["link"]` and `response["data"]["link"]` assumes that these keys and indices exist, which may lead to `KeyError` or `IndexError` if they are missing. To enhance robustness, consider adding checks or using the `.get()` method with default values.
88-107
: Add exception handling to improve robustness in `get_video_url_from_alldebrid`. The function has no exception handling around the asynchronous calls to `get_torrent_info`, `add_new_torrent`, and `wait_for_download_and_get_link`. Adding a `try`-`except` block will help manage potential errors and enhance the function's robustness.
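A sketch of the suggested wrapper (the coroutine name mirrors the review comment; its body is a stand-in, not AllDebrid's real client):

```python
import asyncio

class ProviderException(Exception):
    """Stand-in for the project's exception type."""

async def wait_for_download_and_get_link(fail: bool) -> str:
    if fail:
        raise RuntimeError("network dropped")
    return "https://example.com/video.mkv"

async def get_video_url(fail: bool = False) -> str:
    try:
        return await wait_for_download_and_get_link(fail)
    except ProviderException:
        raise  # already a clean domain error
    except Exception as err:
        # B904-style chaining keeps the root cause in the traceback.
        raise ProviderException("Unexpected provider failure") from err

print(asyncio.run(get_video_url()))  # https://example.com/video.mkv
```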
146-148
: Ensure safe access to dictionary keys in the list comprehension. Accessing `torrent["hash"]` assumes that the `"hash"` key exists in each torrent dictionary, which could lead to a `KeyError` if the key is missing. Consider using the `.get()` method or adding a condition to check for the key.
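The guarded comprehension is a one-line change (the data shape is assumed):

```python
def collect_hashes(torrents: list) -> list:
    # Filtering on key presence skips malformed entries instead of raising KeyError.
    return [t["hash"] for t in torrents if "hash" in t]

print(collect_hashes([{"hash": "abc123"}, {"name": "no-hash"}]))  # ['abc123']
```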
159-164
: Handle potential errors and ensure safe dictionary access. In `delete_all_torrents_from_ad`, accessing `torrents["data"]["magnets"]` assumes that the keys `"data"` and `"magnets"` exist. Additionally, errors during torrent deletion are not handled, which may lead to unhandled exceptions. Consider adding checks for key existence and handling exceptions during the deletion process.
115-129
:⚠️ Potential issue: Avoid suppressing exceptions without handling them. In the `update_ad_cache_status` function, the `except ProviderException: pass` block suppresses exceptions without any handling or logging. This can make debugging difficult, as errors are silently ignored. Consider handling the exception appropriately or logging the error for better traceability. Apply this diff to handle the exception:

```diff
 except ProviderException as e:
-    pass
+    # Log the exception or handle it appropriately
+    print(f"ProviderException in update_ad_cache_status: {e}")
```

Likely invalid or redundant comment.
streaming_providers/realdebrid/client.py (4)
41-55
: Asynchronous `_make_request` method updated appropriately. The `_make_request` method has been successfully converted to an asynchronous method, and the addition of the `json` parameter is correctly handled. This ensures non-blocking I/O operations and aligns with the asynchronous design.
Line range hint
58-75
: Proper asynchronous initialization in `initialize_headers`. Converting `initialize_headers` to an asynchronous method is appropriate since it now awaits the `get_token` method. The logic for handling different token types is correctly implemented.
91-99
: Async methods `get_device_code` and `get_token` are correctly implemented. The methods have been correctly converted to asynchronous functions, and the API calls are properly awaited. This enhances responsiveness when interacting with the Real-Debrid API.
Line range hint
183-201
: Robust error handling in `create_download_link`. The method appropriately handles different response scenarios, including checking for `download` in the response and raising exceptions based on `error_code`. The use of asynchronous calls is consistent and correct.

streaming_providers/premiumize/utils.py (3)
17-21
: Clarify the exception message in `create_or_get_folder_id`. The error message "Folder already created in meanwhile" is unclear and may confuse users. Consider rephrasing it to better reflect the actual issue, such as "Failed to create folder."
134-141
: Add confirmation before deleting all folders to prevent data loss. The `delete_all_torrents_from_pm` function deletes all folders from the user's Premiumize account without confirmation, which could lead to accidental data loss. Consider adding a confirmation step or restricting deletion to specific folders.
25-35
: Ensure all calls to `get_video_url_from_premiumize` are updated to asynchronous. The function has been converted to an asynchronous function; verify that all calls to it in the codebase have been updated to use `await`, and search for any call sites that may need updating.
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (1)
- streaming_providers/pikpak/utils.py (7 hunks)
🧰 Additional context used
🪛 Ruff
streaming_providers/pikpak/utils.py
176-176: Yoda condition detected. Rewrite as `str(error) == "Invalid username or password"` (SIM300)

177-179: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

181-184: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

186-189: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
🔇 Additional comments (4)
streaming_providers/pikpak/utils.py (4)
139-140
: LGTM! Good separation of concerns. The refactoring improves the code by delegating file selection logic to the specialized `select_file_index_from_torrent` function.
Line range hint
145-195
: Great implementation of async context manager! The use of `@asynccontextmanager` ensures proper cleanup of HTTP client resources and provides a cleaner way to manage PikPak API connections.
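The pattern being praised can be reduced to this runnable sketch (the yielded object is a stand-in for the real PikPak HTTP client):

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def pikpak_session(events: list):
    events.append("open")       # e.g. create the HTTP client / log in
    try:
        yield "client"
    finally:
        events.append("close")  # cleanup runs even if the body raises

async def main() -> list:
    events: list = []
    async with pikpak_session(events) as client:
        events.append(f"use:{client}")
    return events

print(asyncio.run(main()))  # ['open', 'use:client', 'close']
```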
204-205
: Good security practice: reduced token expiration time. Reducing the token cache duration from 7 days to 5 minutes significantly improves security by minimizing the window of opportunity for token misuse.
362-395
: Well-structured async implementation with proper resource management. The consistent use of the async context manager across these functions ensures proper resource cleanup. The addition of the `_parse_info_hash_from_magnet` helper function improves code readability.
@coderabbitai close all current reviews and run a fresh review on this PR
Actionable comments posted: 27
🧹 Outside diff range and nitpick comments (12)
streaming_providers/seedr/api.py (2)
13-15
: Consider adding error handling. While the async implementation is correct, consider adding try-except blocks to handle potential connection errors or API failures gracefully.

```diff
 @router.get("/get-device-code")
 async def get_device_code():
-    async with Login() as seedr_login:
-        device_code = await seedr_login.get_device_code()
-        return JSONResponse(content=device_code, headers=const.NO_CACHE_HEADERS)
+    try:
+        async with Login() as seedr_login:
+            device_code = await seedr_login.get_device_code()
+            return JSONResponse(content=device_code, headers=const.NO_CACHE_HEADERS)
+    except Exception as e:
+        return JSONResponse(
+            status_code=500,
+            content={"error": f"Failed to get device code: {str(e)}"},
+            headers=const.NO_CACHE_HEADERS
+        )
```
20-27
: Add error handling and improve response structure. The async implementation is correct, but consider adding error handling and improving the response structure for consistency.

```diff
 @router.post("/authorize")
 async def authorize(data: AuthorizeData):
-    async with Login() as seedr_login:
-        response = await seedr_login.authorize(data.device_code)
-        if "access_token" in response:
-            return JSONResponse(
-                content={"token": seedr_login.token}, headers=const.NO_CACHE_HEADERS
-            )
-        else:
-            return JSONResponse(content=response, headers=const.NO_CACHE_HEADERS)
+    try:
+        async with Login() as seedr_login:
+            response = await seedr_login.authorize(data.device_code)
+            if "access_token" in response:
+                return JSONResponse(
+                    content={"status": "success", "token": seedr_login.token},
+                    headers=const.NO_CACHE_HEADERS
+                )
+            return JSONResponse(
+                content={"status": "pending", "details": response},
+                headers=const.NO_CACHE_HEADERS
+            )
+    except Exception as e:
+        return JSONResponse(
+            status_code=500,
+            content={"status": "error", "message": f"Authorization failed: {str(e)}"},
+            headers=const.NO_CACHE_HEADERS
+        )
```

streaming_providers/torbox/client.py (1)
Line range hint
83-95
: Security: move the token from a URL parameter to the Authorization header. The token is currently passed as a URL parameter, which could expose it in server logs and browser history. Since the Authorization header is already used in other methods, it would be more secure to rely on that instead. Apply this diff to remove the token from URL parameters:

```diff
 async def create_download_link(self, torrent_id, filename):
     response = await self._make_request(
         "GET",
         "/torrents/requestdl",
-        params={"token": self.token, "torrent_id": torrent_id, "file_id": filename},
+        params={"torrent_id": torrent_id, "file_id": filename},
         is_expected_to_fail=True,
     )
```
Line range hint
125-146
: Improve error handling in torrent availability check.The current implementation has several potential issues:
- No timeout for HEAD requests
- No handling of connection errors
- No null check for magnet_link_content
Consider applying this improvement:
async def get_available_torrent(self, info_hash: str) -> dict[str, Any] | None: torrent_list_response = await self.get_transfer_list() if torrent_list_response.get("status") != "success": if torrent_list_response.get("message") == "Not logged in.": raise ProviderException( "Premiumize is not logged in.", "invalid_token.mp4" ) raise ProviderException( "Failed to get torrent info from Premiumize", "transfer_error.mp4" ) available_torrents = torrent_list_response["transfers"] for torrent in available_torrents: src = torrent["src"] - async with self.session.head(src) as response: + try: + async with self.session.head(src, timeout=10) as response: + if response.status != 200: + continue + + magnet_link_content = response.headers.get("Content-Disposition") + if magnet_link_content and info_hash in magnet_link_content: + return torrent + except Exception: + continue - if response.status != 200: - continue - - magnet_link_content = response.headers.get("Content-Disposition") - if info_hash in magnet_link_content: - return torrent
Line range hint
9-148
: Consider adding docstrings to the class and methods. While the implementation is solid, the code would benefit from documentation explaining:
- The purpose of the class
- Method parameters and return types
- Expected exceptions
- Usage examples
streaming_providers/debridlink/client.py (3)
Line range hint
116-124
: Standardize error handling in `get_torrent_info`. The method implements custom error handling while other methods rely on `_handle_service_specific_errors`. This inconsistency makes the code harder to maintain. Consider using the standard error handling approach:

```diff
 async def get_torrent_info(self, torrent_id) -> dict[str, Any]:
     response = await self._make_request(
         "GET", f"{self.BASE_URL}/seedbox/list", params={"ids": torrent_id}
     )
-    if response.get("value"):
-        return response.get("value")[0]
-    raise ProviderException(
-        "Failed to get torrent info from Debrid-Link", "transfer_error.mp4"
-    )
+    return response.get("value", [{}])[0]
```
Line range hint
149-160
: Improve the `get_available_torrent` implementation. The method can be simplified and made more consistent with the rest of the class:

- Use standard error handling through `_handle_service_specific_errors`
- Use a generator expression with `next()` for better readability

Consider this implementation:

```diff
 async def get_available_torrent(self, info_hash: str) -> Optional[dict[str, Any]]:
     torrent_list_response = await self.get_user_torrent_list()
-    if "error" in torrent_list_response:
-        raise ProviderException(
-            "Failed to get torrent info from Debrid-Link", "transfer_error.mp4"
-        )
     available_torrents = torrent_list_response["value"]
-    for torrent in available_torrents:
-        if torrent["hashString"] == info_hash:
-            return torrent
-    return None
+    return next(
+        (torrent for torrent in available_torrents if torrent["hashString"] == info_hash),
+        None
+    )
```
Line range hint
1-160
: Consider standardizing error handling patterns across the class. The class currently uses multiple error handling patterns:

- Some methods use `_handle_service_specific_errors` through `_make_request`
- Some methods implement custom error handling
- Some methods mix both approaches
This inconsistency makes the code harder to maintain and debug.
Consider:
- Documenting when to use each error handling approach
- Standardizing error handling across similar methods
- Creating a common error handling strategy for the entire class
🧰 Tools
🪛 Gitleaks
10-10: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
streaming_providers/debrid_client.py (1)
18-19
: Consider making the timeout configurable. The timeout is hardcoded to 15s; making it configurable through the constructor allows different timeout values for different use cases.

```diff
-    def __init__(self, token: Optional[str] = None):
+    def __init__(self, token: Optional[str] = None, timeout: int = 15):
         self.token = token
         self.is_private_token = False
         self.headers: Dict[str, str] = {}
         self._session: Optional[aiohttp.ClientSession] = None
-        self._timeout = ClientTimeout(total=15)  # Stremio timeout is 20s
+        self._timeout = ClientTimeout(total=timeout)  # Stremio timeout is 20s
```

streaming_providers/seedr/utils.py (1)
138-140
: Log unexpected transfer results for better error diagnosticsWhen an unexpected
transfer["result"]
is encountered, the current code raises a genericProviderException
. To facilitate debugging and error tracking, consider logging the unexpectedtransfer["result"]
value before raising the exception.Add logging before raising the exception:
if transfer["result"] in error_messages: msg, video = error_messages[transfer["result"]] raise ProviderException(msg, video) + else: + logging.error(f"Unexpected transfer result: {transfer['result']}") raise ProviderException( "Error transferring magnet link to Seedr", "transfer_error.mp4" )streaming_providers/qbittorrent/utils.py (2)
209-216
: Remove or explain commented-out code in `set_qbittorrent_preferences`. The `else` block is commented out. If it's not needed, please remove it to improve code clarity. If you plan to use it later, consider adding a comment explaining why it's commented out.
136-136
: Clarify the timeout value in `ClientSession`. The `timeout=18` is set, but it's unclear if this is intentional. Verify that 18 seconds is appropriate for your use case, or consider defining it as a constant with a descriptive name.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (12)
- streaming_providers/alldebrid/client.py (5 hunks)
- streaming_providers/debrid_client.py (1 hunks)
- streaming_providers/debridlink/client.py (7 hunks)
- streaming_providers/offcloud/client.py (2 hunks)
- streaming_providers/offcloud/utils.py (1 hunks)
- streaming_providers/premiumize/client.py (6 hunks)
- streaming_providers/premiumize/utils.py (1 hunks)
- streaming_providers/qbittorrent/utils.py (7 hunks)
- streaming_providers/realdebrid/client.py (9 hunks)
- streaming_providers/seedr/api.py (2 hunks)
- streaming_providers/seedr/utils.py (1 hunks)
- streaming_providers/torbox/client.py (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- streaming_providers/offcloud/utils.py
🧰 Additional context used
🪛 Ruff
streaming_providers/debrid_client.py
36-39: Use `contextlib.suppress(ProviderException)` instead of `try`-`except`-`pass`. Replace with `contextlib.suppress(ProviderException)` (SIM105)

91-93: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

104-104: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

107-110: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

141-144: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)

164-165: Use a single `if` statement instead of nested `if` statements. Combine `if` statements using `and` (SIM102)
streaming_providers/premiumize/utils.py
81-81: Dictionary comprehension uses static key: "files"
(B035)
streaming_providers/realdebrid/client.py
2-2: `typing.override` imported but unused. Remove unused import (F401)

2-2: `typing.overload` imported but unused. Remove unused import (F401)
streaming_providers/seedr/utils.py
150-153: Combine `if` branches using logical `or` operator (SIM114)
🔇 Additional comments (26)
streaming_providers/seedr/api.py (2)
3-3
: LGTM: Appropriate async client import
The migration to `aioseedrcc.Login` is correct for enabling async operations.
Line range hint 1-27
: Verify consistent async patterns across related files
The async migration in this file looks good, but let's verify the consistency of async patterns across related files.
✅ Verification successful
Based on the results, I can now generate the final response:
Async implementation is consistent across the streaming providers
The verification shows:
- All streaming provider modules consistently use async/await patterns
- Proper error handling with try/except blocks is implemented across providers
- Common exception handling patterns using `ProviderException` are used consistently
- Async HTTP clients (aiohttp, httpx) are used appropriately with proper error handling
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Check for consistent async patterns in related files

# Check for async/await usage in related files
echo "Checking async pattern consistency across streaming providers:"
rg -l "async def" "streaming_providers/"

# Look for potential missed async conversions
echo -e "\nChecking for potential missed async conversions:"
rg "def (?!.*async)" "streaming_providers/" | grep -v "test_" | grep "def"

# Check for consistent error handling patterns
echo -e "\nChecking error handling patterns:"
rg "try:|except" "streaming_providers/"
```
Length of output: 10344
streaming_providers/premiumize/utils.py (4)
11-23
: LGTM with a note on error handling
The async conversion looks good with proper await usage and safe dictionary access. However, there's an existing comment about improving the error message clarity that should be addressed.
115-131
: LGTM! Well-structured async implementation
The function has good error handling, proper validation of info hash lengths (40 or 32), and safely handles missing content.
134-141
: LGTM with a note on safety
The async implementation with `asyncio.gather` for parallel deletion is efficient. However, there's an existing comment about adding safety checks before deletion that should be addressed.
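The concurrency pattern in question, reduced to a runnable sketch (the `asyncio.sleep(0)` stands in for the provider's DELETE request):

```python
import asyncio

async def delete_folder(folder_id: int, deleted: list) -> None:
    await asyncio.sleep(0)      # stands in for the HTTP DELETE call
    deleted.append(folder_id)

async def delete_all(folder_ids: list) -> list:
    deleted: list = []
    # gather schedules every deletion concurrently instead of awaiting one by one.
    await asyncio.gather(*(delete_folder(fid, deleted) for fid in folder_ids))
    return sorted(deleted)

print(asyncio.run(delete_all([3, 1, 2])))  # [1, 2, 3]
```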
80-84
:
Fix dictionary comprehension to handle multiple video files
The current implementation will only keep the last video file due to using a static key in the dictionary comprehension. This could cause issues when multiple video files are present.
Apply this fix to properly collect all video files:
```diff
-torrent_info = {
-    "files": file_data
-    for file_data in torrent_folder_data["content"]
-    if "video" in file_data["mime_type"]
-}
+torrent_info = {
+    "files": [
+        file_data
+        for file_data in torrent_folder_data["content"]
+        if "video" in file_data["mime_type"]
+    ]
+}
```
Likely invalid or redundant comment.
🧰 Tools
🪛 Ruff
81-81: Dictionary comprehension uses static key: "files"
(B035)
streaming_providers/premiumize/client.py (5)
1-1
: LGTM! Good addition of type hints.
The addition of typing imports enhances code clarity and type safety.
19-21
: Skip commenting on empty async method.
This was already addressed in past reviews.
40-46
: Skip commenting on unnecessary async.
This was already addressed in past reviews.
147-148
: LGTM! Clean implementation of account info endpoint.
The method correctly implements the async pattern and delegates to _make_request.
Line range hint 52-111
: LGTM! Clean async conversion of API methods.
The methods have been correctly converted to async and properly delegate to _make_request.
streaming_providers/alldebrid/client.py (10)
1-1
: Appropriate addition of `Any` and `Optional` imports. The inclusion of `Any` and `Optional` from the `typing` module enhances type annotations and improves code clarity.
11-11
: Correct use of `Optional[str]` in constructor. The `user_ip` parameter is appropriately typed as `Optional[str] = None`, enhancing type safety and accommodating cases where `user_ip` may not be provided.
15-16
: Confirm the necessity of `async` in `initialize_headers`. The method does not contain any `await` expressions. If it's not required to be asynchronous due to inheritance from `DebridClient`, consider removing the `async` keyword to avoid unnecessary overhead. Please verify whether `initialize_headers` needs to be asynchronous based on the parent class implementation.
18-19
: Review the use of `async` in `disable_access_token`. The method is asynchronous but contains no `await` expressions and only a `pass` statement. Ensure that making it async is necessary, possibly because it overrides an asynchronous method in the parent class. Please confirm whether the parent class requires this method to be asynchronous.
21-22
: Assess the `async` designation for `_handle_service_specific_errors`. The method is marked as asynchronous but does not perform any asynchronous operations. If not mandated by the base class `DebridClient`, consider removing the `async` keyword. Check whether the base class expects this method to be asynchronous.
24-41
: Proper asynchronous implementation of `_make_request`. The method is correctly converted to an asynchronous function and appropriately uses `await` when calling the parent's `_make_request`. The addition of parameters like `json` and the typing enhancements improve flexibility and code readability.
51-53
: Consistent and clear error messages in exceptions. The customized error messages in `ProviderException` provide specific feedback, enhancing debugging and user experience. Including identifiers like `"invalid_token.mp4"` aids error-handling mechanisms.

Also applies to: 60-62, 64-66, 68-70
76-79
: Efficient asynchronous implementation in `add_magnet_link`. The method is appropriately converted to async, and the use of `await` with `_make_request` ensures non-blocking I/O operations. The request parameters are correctly structured.
98-106
: Efficient retrieval in `get_available_torrent`. The method correctly uses asynchronous calls and iterates over the user's torrent list to find a matching `info_hash`. Error validation is already included, ensuring robustness.
Line range hint 108-120
: Proper error handling in `create_download_link`. The method handles both successful and failed responses appropriately, raising a `ProviderException` when necessary. The use of `is_expected_to_fail` aligns with the potential for the request to fail without immediately raising an exception.
streaming_providers/realdebrid/client.py (3)
2-2
: Remove unused imports `override` and `overload`. These imports from the `typing` module are not used in this file; removing them will clean up the code and eliminate warnings from static analysis tools.
🧰 Tools
🪛 Ruff

2-2: `typing.override` imported but unused. Remove unused import (F401)

2-2: `typing.overload` imported but unused. Remove unused import (F401)
172-177
: Optimize search in `get_available_torrent`. The method fetches the entire torrent list and iterates over it to find the matching `info_hash`. If the API provides an endpoint to retrieve a torrent by its hash directly, consider using it to improve efficiency.
21-34
:
Ensure Python 3.10+ compatibility due to the use of `match` statements. The `match` statements in the `_handle_service_specific_errors` method require Python 3.10 or newer. Please verify that the project is intended to run on Python 3.10 or higher to avoid syntax errors.
To confirm the project's Python version compatibility, you can check for the required Python version specification in the project configuration files.
streaming_providers/qbittorrent/utils.py (2)
277-279
: Ensure new parameters are handled correctly in `get_video_url_from_qbittorrent`. The addition of the `indexer_type`, `filename`, and `episode` parameters requires updates wherever this function is called. Verify that all calls to this function pass the new arguments appropriately.
226-228
: Update `retrieve_or_download_file` calls with new parameters. Similarly, ensure that all calls to `retrieve_or_download_file` now include `indexer_type`, `filename`, and `episode` to prevent runtime errors.
```diff
 async def _make_request(
     self,
     method: str,
     url: str,
     data: Optional[dict | str] = None,
     json: Optional[dict] = None,
     params: Optional[dict] = None,
     is_return_none: bool = False,
     is_expected_to_fail: bool = False,
     retry_count: int = 0,
 ) -> dict:
     params = params or {}
     url = self.BASE_URL + url
-    return super()._make_request(
-        method, url, data, params, is_return_none, is_expected_to_fail
-    )
+    return await super()._make_request(
+        method, url, data, json, params, is_return_none, is_expected_to_fail
+    )
```
Remove unused `retry_count` parameter. The `retry_count` parameter is defined in the method signature but is never used in the method implementation or passed to the parent class's `_make_request` call.
Apply this diff to remove the unused parameter:
```diff
 async def _make_request(
     self,
     method: str,
     url: str,
     data: Optional[dict | str] = None,
     json: Optional[dict] = None,
     params: Optional[dict] = None,
     is_return_none: bool = False,
-    is_expected_to_fail: bool = False,
-    retry_count: int = 0,
+    is_expected_to_fail: bool = False
 ) -> dict:
```
```python
async def get_available_torrent(self, info_hash) -> dict[str, Any] | None:
    response = await self.get_user_torrent_list()
    torrent_list = response.get("data", [])
    for torrent in torrent_list:
        if torrent.get("hash", "") == info_hash:
            return torrent
    return {}
```
Fix return type inconsistency in `get_available_torrent`. The method's return type hint indicates `dict[str, Any] | None`, but it returns an empty dict when no torrent is found instead of `None`.
Apply this diff to align the implementation with the type hint:
```diff
 async def get_available_torrent(self, info_hash) -> dict[str, Any] | None:
     response = await self.get_user_torrent_list()
     torrent_list = response.get("data", [])
     for torrent in torrent_list:
         if torrent.get("hash", "") == info_hash:
             return torrent
-    return {}
+    return None
```
```diff
 async def get_video_url_from_premiumize(
     info_hash: str,
     magnet_link: str,
     user_data: UserData,
-    torrent_name: str,
-    filename: str,
+    filename: Optional[str],
+    episode: Optional[int],
     max_retries=5,
     retry_interval=5,
     **kwargs,
 ) -> str:
-    pm_client = Premiumize(token=user_data.streaming_provider.token)
-
-    # Check if the torrent already exists
-    torrent_info = pm_client.get_available_torrent(info_hash, torrent_name)
-    if torrent_info:
-        torrent_id = torrent_info.get("id")
-        if torrent_info["status"] == "error":
-            pm_client.delete_torrent(torrent_id)
-            raise ProviderException(
-                "Not enough seeders available for parse magnet link",
-                "transfer_error.mp4",
-            )
-    else:
-        # If torrent doesn't exist, add it
-        folder_id = create_or_get_folder_id(pm_client, info_hash)
-        response_data = pm_client.add_magnet_link(magnet_link, folder_id)
-        if "id" not in response_data:
-            raise ProviderException(
-                "Failed to add magnet link to Real-Debrid", "transfer_error.mp4"
-            )
-        torrent_id = response_data["id"]
-
-    # Wait for file selection and then start torrent download
-    torrent_info = pm_client.wait_for_status(
-        torrent_id, "finished", max_retries, retry_interval
-    )
-    return get_stream_link(pm_client, torrent_info, filename, info_hash)
+    async with Premiumize(token=user_data.streaming_provider.token) as pm_client:
+        # Check if the torrent already exists
+        torrent_info = await pm_client.get_available_torrent(info_hash)
+        if torrent_info:
+            torrent_id = torrent_info.get("id")
+            if torrent_info["status"] == "error":
+                await pm_client.delete_torrent(torrent_id)
+                raise ProviderException(
+                    "Not enough seeders available for parse magnet link",
+                    "transfer_error.mp4",
+                )
+        else:
+            # If torrent doesn't exist, add it
+            folder_id = await create_or_get_folder_id(pm_client, info_hash)
+            response_data = await pm_client.add_magnet_link(magnet_link, folder_id)
+            if "id" not in response_data:
+                raise ProviderException(
+                    "Failed to add magnet link to Premiumize", "transfer_error.mp4"
+                )
+            torrent_id = response_data["id"]
+
+        # Wait for file selection and then start torrent download
+        torrent_info = await pm_client.wait_for_status(
+            torrent_id, "finished", max_retries, retry_interval
+        )
+        return await get_stream_link(
+            pm_client, torrent_info, filename, info_hash, episode
+        )
```
🛠️ Refactor suggestion

Consider extracting retry configuration

The async implementation looks great with proper error handling and status checking. However, the retry configuration could be moved to settings for better maintainability.

Consider extracting the retry configuration:

```diff
+from config import settings
+
 async def get_video_url_from_premiumize(
     info_hash: str,
     magnet_link: str,
     user_data: UserData,
     filename: Optional[str],
     episode: Optional[int],
-    max_retries=5,
-    retry_interval=5,
+    max_retries=settings.PREMIUMIZE_MAX_RETRIES,
+    retry_interval=settings.PREMIUMIZE_RETRY_INTERVAL,
     **kwargs,
 ) -> str:
```

Committable suggestion skipped: line range outside the PR's diff.
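If the project does not already expose such values, one stdlib-only way to back environment-tunable retry settings is sketched below. This is illustrative: the real codebase presumably has its own `config.settings` object (likely pydantic-based), and the field names here are assumptions.

```python
import os
from dataclasses import dataclass, field


@dataclass(frozen=True)
class ProviderRetrySettings:
    """Illustrative stand-in for a central settings object."""

    # Read from the environment with safe defaults, so deployments can
    # tune retry behaviour without code changes.
    premiumize_max_retries: int = field(
        default_factory=lambda: int(os.environ.get("PREMIUMIZE_MAX_RETRIES", "5"))
    )
    premiumize_retry_interval: int = field(
        default_factory=lambda: int(os.environ.get("PREMIUMIZE_RETRY_INTERVAL", "5"))
    )


settings = ProviderRetrySettings()
print(settings.premiumize_max_retries)  # 5 unless overridden in the environment
```

Centralizing these defaults keeps every provider's polling behaviour tunable from one place instead of scattering magic numbers across function signatures.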
```diff
 async def update_pm_cache_status(
     streams: list[TorrentStreams], user_data: UserData, **kwargs
 ):
     """Updates the cache status of streams based on Premiumize's instant availability."""

     try:
-        pm_client = Premiumize(token=user_data.streaming_provider.token)
-        instant_availability_data = pm_client.get_torrent_instant_availability(
-            [stream.id for stream in streams]
-        )
-        for stream, cached_status in zip(
-            streams, instant_availability_data.get("response")
-        ):
-            stream.cached = cached_status
+        async with Premiumize(token=user_data.streaming_provider.token) as pm_client:
+            instant_availability_data = (
+                await pm_client.get_torrent_instant_availability(
+                    [stream.id for stream in streams]
+                )
+            )
+            for stream, cached_status in zip(
+                streams, instant_availability_data.get("response")
+            ):
+                stream.cached = cached_status

     except ProviderException:
         pass
```
🛠️ Refactor suggestion

Improve error handling and zip safety

While the async implementation is good, consider these improvements:

- Add logging for caught exceptions instead of silently passing
- Use `strict=True` with `zip` to ensure equal length iterations

Consider these improvements:

```diff
+import logging
+
 async def update_pm_cache_status(
     streams: list[TorrentStreams], user_data: UserData, **kwargs
 ):
     try:
         async with Premiumize(token=user_data.streaming_provider.token) as pm_client:
             instant_availability_data = (
                 await pm_client.get_torrent_instant_availability(
                     [stream.id for stream in streams]
                 )
             )
             for stream, cached_status in zip(
                 streams, instant_availability_data.get("response"),
+                strict=True
             ):
                 stream.cached = cached_status
     except ProviderException:
-        pass
+        logging.error("Failed to update PM cache status", exc_info=True)
```
```python
async def _make_request(
    self,
    method: str,
    url: str,
    data: Optional[dict | str] = None,
    json: Optional[dict] = None,
    params: Optional[dict] = None,
    is_return_none: bool = False,
    is_expected_to_fail: bool = False,
    retry_count: int = 0,
) -> dict | list:
    params = params or {}
    if self.is_private_token:
        params["apikey"] = self.token
    return await super()._make_request(
        method, url, data, json, params, is_return_none, is_expected_to_fail
    )
```
Remove unused `retry_count` parameter.

The `retry_count` parameter is defined but never used in the method implementation.

Apply this diff to remove the unused parameter:

```diff
 async def _make_request(
     self,
     method: str,
     url: str,
     data: Optional[dict | str] = None,
     json: Optional[dict] = None,
     params: Optional[dict] = None,
     is_return_none: bool = False,
     is_expected_to_fail: bool = False,
-    retry_count: int = 0,
 ) -> dict | list:
```
streaming_providers/seedr/utils.py (Outdated)
```python
async def wait_for_completion(
    seedr: Seedr, info_hash: str, max_retries: int = 1, retry_interval: int = 1
) -> None:
    """Wait for torrent to complete downloading."""
    for _ in range(max_retries):
        status, data = await check_torrent_status(seedr, info_hash)

        if status == TorrentStatus.COMPLETED:
            return
        elif status == TorrentStatus.DOWNLOADING and data.get("progress") == "100":
            return

        await asyncio.sleep(retry_interval)

    raise ProviderException("Torrent not downloaded yet.", "torrent_not_downloaded.mp4")
```
🛠️ Refactor suggestion

Consider increasing retries and implementing exponential backoff in `wait_for_completion`

The current `max_retries` value of 1 and `retry_interval` of 1 second in `wait_for_completion` may not be sufficient for torrents that take longer to download. To improve reliability, consider increasing the `max_retries` value and implementing exponential backoff to handle varying download times more effectively.

Here's how you might adjust the function:

```diff
 async def wait_for_completion(
-    seedr: Seedr, info_hash: str, max_retries: int = 1, retry_interval: int = 1
+    seedr: Seedr, info_hash: str, max_retries: int = 5, retry_interval: int = 1
 ) -> None:
     """Wait for torrent to complete downloading."""
     for _ in range(max_retries):
         status, data = await check_torrent_status(seedr, info_hash)
-        if status == TorrentStatus.COMPLETED:
-            return
-        elif status == TorrentStatus.DOWNLOADING and data.get("progress") == "100":
+        if status == TorrentStatus.COMPLETED or (
+            status == TorrentStatus.DOWNLOADING and data.get("progress") == "100"
+        ):
             return
         await asyncio.sleep(retry_interval)
+        retry_interval *= 2  # Exponential backoff
     raise ProviderException("Torrent not downloaded yet.", "torrent_not_downloaded.mp4")
```
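The poll-with-exponential-backoff pattern suggested here generalizes to any async status check. A stdlib-only sketch of such a helper follows; the name `wait_until` and its signature are illustrative, not taken from the codebase.

```python
import asyncio
from typing import Awaitable, Callable


async def wait_until(
    predicate: Callable[[], Awaitable[bool]],
    max_retries: int = 5,
    initial_interval: float = 1.0,
    backoff_factor: float = 2.0,
) -> bool:
    """Poll an async predicate, multiplying the sleep between attempts."""
    interval = initial_interval
    for _ in range(max_retries):
        if await predicate():
            return True
        await asyncio.sleep(interval)
        interval *= backoff_factor  # e.g. 1s, 2s, 4s, 8s, ...
    return False


async def demo() -> bool:
    attempts = {"n": 0}

    async def done() -> bool:
        attempts["n"] += 1
        return attempts["n"] >= 3  # succeeds on the third poll

    # Tiny interval so the demo finishes quickly
    return await wait_until(done, max_retries=5, initial_interval=0.01)


print(asyncio.run(demo()))  # True
```

A provider's `wait_for_completion` could then delegate the looping to this helper and raise its `ProviderException` only when the helper returns `False`.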
🧰 Tools

🪛 Ruff

150-153: Combine `if` branches using logical `or` operator (SIM114)
streaming_providers/seedr/utils.py (Outdated)
```python
async def clean_names(seedr: Seedr, folder_id: str) -> None:
    """Clean special characters from all file and folder names."""
    content = await seedr.list_contents(folder_id)

    for file in content["files"]:
        clean_name = clean_filename(file["name"])
        if file["name"] != clean_name:
            await seedr.rename_item(file["folder_file_id"], clean_name, "file")
```
Ensure `clean_names` function cleans folder names as well

The `clean_names` function's docstring indicates it cleans special characters from all file and folder names. Currently, it only processes files and does not clean folder names. To fully implement the intended functionality, consider adding logic to clean folder names as well.

Apply the following changes to include folder names:

```diff
 async def clean_names(seedr: Seedr, folder_id: str) -> None:
     """Clean special characters from all file and folder names."""
     content = await seedr.list_contents(folder_id)
+    # Clean subfolder names
+    for subfolder in content["folders"]:
+        clean_name = clean_filename(subfolder["name"])
+        if subfolder["name"] != clean_name:
+            await seedr.rename_item(subfolder["id"], clean_name, "folder")
     # Clean file names
     for file in content["files"]:
         clean_name = clean_filename(file["name"])
         if file["name"] != clean_name:
             await seedr.rename_item(file["folder_file_id"], clean_name, "file")
```
```diff
     return selected_file


 @asynccontextmanager
 async def initialize_qbittorrent(user_data: UserData) -> APIClient:
-    async with ClientSession(cookie_jar=CookieJar(unsafe=True)) as session:
+    async with ClientSession(cookie_jar=CookieJar(unsafe=True), timeout=18) as session:
```
Security Concern: Avoid using `unsafe=True` in `CookieJar`.

Setting `unsafe=True` in `CookieJar` allows insecure cookies, which can pose security risks by accepting cookies from non-HTTPS sources. Consider removing `unsafe=True` to enforce secure cookie handling, unless there's a specific need for it.
```diff
     await set_qbittorrent_preferences(qbittorrent, indexer_type)
     await add_magnet(qbittorrent, magnet_link, info_hash, user_data)
     await wait_for_torrent_to_complete(
         qbittorrent, info_hash, play_video_after, max_retries, retry_interval
     )
     selected_file = await find_file_in_folder_tree(
-        webdav, user_data, info_hash, filename
+        webdav, user_data, info_hash, filename, episode
     )
-    if selected_file is None:
+    if not selected_file:
         raise ProviderException(
             "No matching file available for this torrent", "no_matching_file.mp4"
         )
```
🛠️ Refactor suggestion

Handle exceptions in `retrieve_or_download_file` more specifically.

When raising a `ProviderException`, consider providing more context or handling specific exceptions to aid in debugging and user feedback.
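One common way to carry that extra context is exception chaining with `raise ... from`. The sketch below uses a stand-in `ProviderException` with an assumed two-argument shape (message plus fallback video name); the project's real class and call sites may differ.

```python
class ProviderException(Exception):
    """Stand-in for the project's exception type (assumed shape)."""

    def __init__(self, message: str, video_file_name: str):
        self.message = message
        self.video_file_name = video_file_name
        super().__init__(message)


def find_file(files: list[str], filename: str) -> str:
    try:
        return next(f for f in files if f == filename)
    except StopIteration as exc:
        # `raise ... from exc` preserves the original traceback,
        # and the message names the file and candidate count for debugging.
        raise ProviderException(
            f"No matching file for {filename!r} among {len(files)} candidates",
            "no_matching_file.mp4",
        ) from exc


try:
    find_file(["a.mkv", "b.mkv"], "c.mkv")
except ProviderException as e:
    print(e.message)  # No matching file for 'c.mkv' among 2 candidates
```

The chained cause (`e.__cause__`) then shows up in logs alongside the user-facing message, which is exactly the kind of context this suggestion is after.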
```python
selected_file_index = await select_file_index_from_torrent(
    {"files": files}, filename, episode
)
```
🛠️ Refactor suggestion

Simplify argument passing to `select_file_index_from_torrent`.

Instead of wrapping `files` in a dictionary, you can pass the list directly if the function accepts it. This makes the code cleaner:

```diff
-selected_file_index = await select_file_index_from_torrent(
-    {"files": files}, filename, episode
-)
+selected_file_index = await select_file_index_from_torrent(
+    files, filename, episode
+)
```
Committable suggestion skipped: line range outside the PR's diff.
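For context, a selector that takes a plain list might be sketched as follows. This is illustrative only: `select_file_index` and its matching rules are assumptions, and the project's real `select_file_index_from_torrent` may behave differently.

```python
import asyncio
from typing import Optional


async def select_file_index(
    files: list[dict], filename: Optional[str], episode: Optional[int]
) -> int:
    """Pick a file by exact name, then by an episode tag, else by largest size."""
    if filename is not None:
        for index, file in enumerate(files):
            if file["name"] == filename:
                return index
    if episode is not None:
        tag = f"E{episode:02d}"  # e.g. episode 2 -> "E02"
        for index, file in enumerate(files):
            if tag in file["name"].upper():
                return index
    # Fall back to the largest file, which is usually the main video
    return max(range(len(files)), key=lambda i: files[i]["size"])


files = [
    {"name": "Sample.mkv", "size": 10},
    {"name": "Show.S01E02.mkv", "size": 900},
]
print(asyncio.run(select_file_index(files, None, 2)))  # 1
```

Passing the list directly, as the suggestion proposes, keeps this kind of logic free of one-key wrapper dicts at every call site.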
@mhdzumair: I have closed all current reviews and will run a fresh review on this PR.

✅ Actions performed

Comments resolved. Review triggered.
Summary by CodeRabbit
Release Notes
New Features
Bug Fixes
Documentation
These enhancements significantly improve the efficiency and responsiveness of the application when interacting with streaming services.