fix: use singleton in llama_cpp #1013
Merged · Codecov / codecov/project — succeeded Jun 25, 2024 in 0s
21.83% (+0.14%) compared to 651eb33
Codecov Report
Attention: Patch coverage is 38.09524%, with 13 lines in your changes missing coverage. Please review.
Project coverage is 21.83%. Comparing base (651eb33) to head (5aa8871).
Report is 7 commits behind head on main.
| Files | Patch % | Lines |
|---|---|---|
| application/llm/llama_cpp.py | 38.09% | 13 Missing |
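The fix this PR merges keeps a single shared model instance rather than re-loading the llama.cpp model on every request. A minimal sketch of that pattern is below; the class and method names (`LlamaSingleton`, `get_instance`, `_load_model`) are illustrative assumptions, not the actual code in `application/llm/llama_cpp.py`:

```python
# Hedged sketch of a per-model-path singleton: load a heavyweight model
# (e.g. a llama.cpp binding) at most once and reuse it afterwards.
# Names here are hypothetical, not the DocsGPT implementation.
import threading


class LlamaSingleton:
    _instances = {}           # cache of loaded models, keyed by model path
    _lock = threading.Lock()  # guards against two threads loading at once

    @classmethod
    def get_instance(cls, model_path):
        # Double-checked locking: fast path avoids the lock entirely.
        if model_path not in cls._instances:
            with cls._lock:
                if model_path not in cls._instances:
                    cls._instances[model_path] = cls._load_model(model_path)
        return cls._instances[model_path]

    @staticmethod
    def _load_model(model_path):
        # Placeholder for the expensive load, e.g. Llama(model_path=...).
        return object()


# Repeated calls with the same path return the same cached object.
a = LlamaSingleton.get_instance("model.gguf")
b = LlamaSingleton.get_instance("model.gguf")
assert a is b
```

Caching by model path (rather than a single global) lets different configurations coexist while still avoiding duplicate loads of the same model.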
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##             main    #1013      +/-   ##
==========================================
+ Coverage   21.69%   21.83%   +0.14%
==========================================
  Files          80       80
  Lines        3632     3645      +13
==========================================
+ Hits          788      796       +8
- Misses       2844     2849       +5
```
☔ View full report in Codecov by Sentry.