
[Draft] Merge upstream #7

Open
zaoqi wants to merge 201 commits into devel

Conversation


@zaoqi commented Oct 2, 2019

No description provided.

angristan and others added 30 commits March 10, 2018 15:58
Use new LABEL syntax for Dockerfile
Searching in English now returns results from all pages.
Searching in a specific language other than English correctly returns
only the pages translated into the selected language.
Signed-off-by: Robin Hallabro <[email protected]>
- Fix for docker image build error "Could not find a version that satisfies the requirement cffi!=1.11.3,>=1.7 (from versions: )"
…searx#1399)

Instead of a single line with 500000 characters, use nicely formatted JSON.
Sort the lists in engine_languages.py so that when updating it is easier to
see the differences (search engines do change the order in which their
languages are listed).
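
A minimal sketch of the formatting change described above, assuming the fetched languages are collected into a dict keyed by engine name (the helper name and output path are illustrative, not the actual fetch script):

```python
import json

def write_engines_languages(languages, path='searx/data/engines_languages.json'):
    # Sort each engine's language list so reordering on the engine side
    # does not produce noisy diffs.
    for engine_name, engine_languages in languages.items():
        if isinstance(engine_languages, list):
            languages[engine_name] = sorted(engine_languages)
    # Indented, key-sorted JSON instead of one 500000-character line.
    with open(path, 'w', encoding='utf-8') as f:
        json.dump(languages, f, indent=2, sort_keys=True, ensure_ascii=False)
```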
- Forgot one "\" in the process!
Yetangitu and others added 30 commits July 27, 2019 17:49
use JSON where possible, compose 'content' using all available data, use correct 'url' (direct to source instead of redirect through bing)
[fix] Small fixes in Preferences view's text
…_format', 'source', etc. (searx#1571)

Fetch complete JSON data block, use legend to extract images. 
Unquote urlencoded strings.
Add image description as 'content'. 
Add 'img_format' and 'source' data (needs PR searx#1567 to enable this data to be displayed). 
Show images which lack ownerid instead of discarding them.
* Search URL is https://www.wikidata.org/w/index.php?{query}&ns0=1 (with ns0=1 at the end to avoid an HTTP redirection)
* url_detail: remove the deprecated disabletidy=1 parameter
* Add eval_xpath function: compile each XPath expression once and reuse it.
* Add get_id_cache: retrieve all HTML elements with an id, avoiding the slow-to-process dynamic XPath '//div[@id="{propertyid}"]'.replace('{propertyid}')
* Create an etree.HTMLParser() instead of using the global one (see searx#1575)
[fix] wikidata engine: faster processing, remove one HTTP redirection.
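A rough sketch of the eval_xpath / get_id_cache idea from the wikidata commit above (lxml is assumed; the real engine code may differ in detail):

```python
from lxml import etree

_xpath_cache = {}

def eval_xpath(element, xpath_str):
    # Compile each XPath expression only once, then reuse the compiled object.
    if xpath_str not in _xpath_cache:
        _xpath_cache[xpath_str] = etree.XPath(xpath_str)
    return _xpath_cache[xpath_str](element)

def get_id_cache(result):
    # Collect every element carrying an id attribute in a single pass,
    # instead of running a dynamic '//div[@id="..."]' XPath per property.
    return {element.get('id'): element for element in eval_xpath(result, '//*[@id]')}
```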
Fix dailymotion, google_videos and youtube_noapi engines
Characters that were not ASCII were incorrectly decoded.
Add a helper function: searx.utils.ecma_unescape (a Python implementation of the JavaScript unescape function).
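
A hedged sketch of what such a helper can look like (the real searx.utils.ecma_unescape may differ in detail): it decodes %uXXXX escapes first, then plain %XX escapes.

```python
import re

ECMA_UNESCAPE4_RE = re.compile(r'%u([0-9a-fA-F]{4})')
ECMA_UNESCAPE2_RE = re.compile(r'%([0-9a-fA-F]{2})')

def ecma_unescape(s):
    """Python re-implementation of the JavaScript unescape function."""
    # "%u5409" becomes the corresponding Unicode character
    s = ECMA_UNESCAPE4_RE.sub(lambda m: chr(int(m.group(1), 16)), s)
    # "%20" becomes " "
    s = ECMA_UNESCAPE2_RE.sub(lambda m: chr(int(m.group(1), 16)), s)
    return s
```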
The new URL parameter "timeout_limit" sets the timeout limit, in seconds.
Example: "timeout_limit=1.5" means the timeout limit is 1.5 seconds.

In addition, the query can start with <[number] to set the timeout limit.

For numbers between 0 and 99, the unit is the second:
Example: "<30 searx" means the timeout limit is 30 seconds.

For numbers above 100, the unit is the millisecond:
Example: "<850 searx" means the timeout is 850 milliseconds.

In addition, there is a new optional setting: outgoing.max_request_timeout.
If it is not set, the user timeout cannot exceed the searx configuration (as before: the maximum timeout of the engines selected for a query).

If the value is set, the user can choose a timeout between 0 and max_request_timeout using the
<[number] prefix or the timeout_limit query parameter.

Related to searx#1077
Updated version of PR searx#1413 from @isj-privacore
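A minimal sketch of the query-prefix parsing described above (names are illustrative, not the actual searx code): values below 100 are read as seconds, larger values as milliseconds, capped by max_request_timeout when that setting is present.

```python
import re

TIMEOUT_PREFIX_RE = re.compile(r'^<(\d+)\s+')

def parse_timeout_prefix(query, max_request_timeout=None):
    """Return (query_without_prefix, timeout_limit_in_seconds_or_None)."""
    match = TIMEOUT_PREFIX_RE.match(query)
    if match is None:
        return query, None
    raw = int(match.group(1))
    timeout = raw if raw < 100 else raw / 1000.0  # seconds below 100, else milliseconds
    if max_request_timeout is not None:
        timeout = min(timeout, max_request_timeout)
    return query[match.end():], timeout
```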
[fix] fix paging for the oscar theme after PR searx#1640
Close searx#1664
At the end of test_webapp.py, the monkey patch of searx.search.Search was not reverted, which led to side effects on other tests.
close searx#1663
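A small sketch of the fix pattern that commit describes for test_webapp.py, assuming a unittest-style test class (the class name and the stand-in fake are illustrative):

```python
import unittest
import searx.search

class WebappTestCase(unittest.TestCase):
    def setUp(self):
        # Remember the original so the patch can be undone.
        self.original_search = searx.search.Search
        searx.search.Search = object()  # stand-in fake for this test only

    def tearDown(self):
        # Revert the monkey patch so later tests see the real class.
        searx.search.Search = self.original_search
```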
…he new version (searx#1668)

Before this commit, the existing settings.yml was always replaced.
Before this commit, there were sometimes no results.
Use a generic user agent instead of one with the OS "Windows NT 6.3; WOW64".
This PR fixes the result count from Bing, which was throwing a (hidden) error, and adds a validation to avoid reading more results than are available.

For example:
If a search has 100 results and we try to get results 120 to 130, Bing sends back results 0 to 10 with no error. By comparing the result count with the first result offset of the request, we can discard these "invalid" results.
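A hedged sketch of that validation (not the exact bing.py code): if Bing reports fewer total results than the offset we requested, the returned page is really the first page again and its results are discarded.

```python
def filter_invalid_page(results, reported_result_count, first_result_offset):
    # Bing silently falls back to page 1 when the offset is past the end,
    # so drop the results rather than duplicating the first page.
    if reported_result_count <= first_result_offset:
        return []
    return results
```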
Add image format and source information to the display - engines need changes to actually display something.

Displays result.source (website from which the image was taken) and result.img_format (image type and size).

Result is styled with result-format and result-source classes. See PR searx#1566 for an example of an engine which has the necessary changes.

Strip <span class="highlight">...</span> in the oscar image template.
No inline script for oscar and simple theme